Gen AI and private equity – what are the opportunities and risks?


Technology


Unlimited potential can often come with “liabilities and complications”


By Kenneth Araullo

Generative AI holds significant promise for private equity, but it also brings potential risks that firms must consider. According to Cheryl Kosche, senior vice president and senior client manager at Lockton, private equity professionals are not only investing heavily in generative AI companies but are also integrating the technology into their day-to-day business operations.

“As the industry continues to embrace ways to use AI, however, PE firms must be fully aware of the potential liabilities and complications it can present,” Kosche said.

Investment-related AI tools are delivering value to private equity firms, Kosche said. These tools provide rapid access to robust market analytics, facilitating comprehensive due diligence and better-informed valuations.

“AI can also enable significant efficiencies in logistics, as well as any repetitive task or data analysis need. This can help reduce costs and preserve a private equity firm’s cash and human resources, which can be redeployed to other areas of the business to help drive growth,” she said.

However, Kosche warned that regulatory bodies such as the Securities and Exchange Commission (SEC) are closely monitoring private equity’s use of AI.

“With regulators hyper-focused on private equity, it’s vital that you examine internal processes related to AI at the fund level, understand potential AI-related risks that portfolio companies might bring, and have the right insurance program in place to mitigate your risk,” Kosche said.


Responsible AI use is key

Kosche noted that private equity firms must articulate their organizational philosophy and guidelines on responsible AI use. This includes addressing concerns like AI washing, where firms falsely claim to use AI in their investment strategies, and potential conflicts of interest, such as training AI to prioritize the firm’s interests over those of its clients.

“The private equity world has historically considered data, processes, algorithms, and products to be proprietary intellectual property (whether by trade secret, copyright or patent), and fiercely guarded them as a result. Emerging case law and regulations, however, maintain that generative-AI-assisted works are generally not proprietary,” she said.

Antitrust concerns are also relevant, Kosche said, as AI use is subject to the Sherman Act. The Department of Justice and private plaintiffs can litigate if AI is used to create an unfair competitive advantage.

“With the ‘Club Deal’ litigation still in recent memory, private equity firms should be particularly aware of this exposure,” she said.

Ethical considerations also come into play when AI increases efficiency but potentially displaces the workforce. Kosche said the private equity industry needs to consider retraining displaced workers to avoid reputational risks.

“Likewise, the use of AI in prioritizing and selecting candidates also raises the question of how AI is being taught to make appropriate and qualified candidate selections without discriminating based on age, gender, race/ethnicity, sexual orientation, and other factors,” Kosche said.

Sweeping for AI risks and exposures

Kosche noted that assessing AI exposure is complex due to limited precedents and numerous unintended consequences. One notable risk comes from legislation like Illinois’ Biometric Information Privacy Act (BIPA) and similar laws in nearly 30 other states.


“With AI technologies increasingly being embedded in biometric data applications, such as voice and face recognition software, the potential exposures for companies are vast,” she said. “So, it’s critical that private equity firms understand how portfolio companies are using these tools and if they are doing so in Illinois and other states where biometric privacy laws have been passed.”

Given AI’s relatively uncharted legal and regulatory territory, firms also need to be prepared for unforeseen lawsuits or regulatory actions. When evaluating fund-level insurance programs, such as general partnership liability, cyber liability, and fidelity bond coverage, Kosche said firms should consider how these policies respond to AI-related risks.

“For example, will they respond to alleged antitrust violations, BIPA violations, intellectual property disputes, regulatory investigations, and claims of discrimination, conflicts of interest, and AI washing?” she said. “Do they contain express coverage for these areas, or do they have exclusions or limitations that might restrict or remove coverage altogether? How do the various policies coordinate with one another if a claim implicates multiple policies at the portfolio company level, or at both the portfolio company and fund level?”

As private equity firms explore AI’s potential, they must also navigate the associated risks, and this is where brokers like Lockton come into play, Kosche said.

“At Lockton, our team of experienced alternative investment insurance brokers is available to help review your current insurance program and provide recommendations to maximize claims recovery, giving you peace of mind and allowing you to spend that energy on other areas of your business,” she said.

