New York DFS zones in on insurance AI discrimination

New York State regulator flags “significant concerns,” proposes guidance

The New York State Department of Financial Services (NYDFS) has raised “significant concerns” over the potential for insurance AI and external consumer data use to result in discrimination, saying guidance is needed to ensure the technology does not drive unfair outcomes.

In a letter sent to insurers, dated January 17, the NYDFS focused on underwriting and pricing, warning of the risk of “unfair adverse” effects stemming from the use of AI and external consumer data and information sources (ECDIS).

While it acknowledged that AI and ECDIS could simplify and expedite processes, the NYDFS pointed to the risk of systemic biases in ECDIS that could “reinforce and exacerbate inequality”.

Additionally, it sounded the alarm on “variable accuracy and reliability” issues, with ECDIS potentially coming from entities that are not subject to regulatory oversight.

“Furthermore, the self-learning behavior of AIS [artificial intelligence systems] increases the risks of inaccurate, arbitrary, capricious, or unfairly discriminatory outcomes that may disproportionately affect vulnerable communities and individuals or otherwise undermine the insurance marketplace in New York,” the NYDFS said in the circular letter to insurers, fraternal benefit societies, and the New York State Insurance Fund.

Insurers that use ECDIS and AI systems must establish proper governance and risk management frameworks to minimize the risk of consumer harm and ensure legal compliance, the NYDFS set out.


The NYDFS is seeking feedback on the proposed guidance by March 17, 2024.

NYDFS reminds insurers of third-party vendor risks around insurance AI and ECDIS

Insurers were reminded that they remain responsible for compliance with anti-discrimination laws regardless of whether they collect data and underwrite directly or rely on external vendors for ECDIS or AI services.

“An insurer may not use ECDIS or AIS to collect or use information that the insurer would otherwise be prohibited from collecting or using directly,” the NYDFS said in the letter. “An insurer may not rely solely on a vendor’s claim of non-discrimination or a proprietary third-party process to determine compliance with anti-discrimination laws.”

Insurers should be able to comprehensively demonstrate that any use of ECDIS and AI does not result in unfair or unlawful discrimination, the NYDFS said.

NYDFS lays out guidance to ensure insurance AI compliance and prevent discrimination

Compliance efforts should include thorough documentation and regular testing, as well as quantitative and qualitative assessments, according to the NYDFS.

Corporate governance frameworks should also provide appropriate oversight, the regulator set out.

It also pointed to the importance of risk management and internal controls, and to the need for insurers to be transparent about where AI is used in underwriting and pricing.

NYDFS not alone in targeting potential insurance AI harms

New York State regulators are not alone in focusing on insurance AI risk.


The Colorado Division of Insurance (CDI) last fall adopted a regulation governing the use of ECDIS and AI in life insurance, effective from November 2023, which was hailed at the time as a landmark step.

In California, insurers must “avoid both conscious and unconscious bias or discrimination that can and often does result from the use of artificial intelligence”, Insurance Commissioner Ricardo Lara said in a June 2022 bulletin.

“As the insurance sector navigates the complexities of AI, the NAIC’s Model Bulletin on the Use of Artificial Intelligence Systems by Insurers provides a robust foundation to safeguard consumers, promote fairness, and uphold the highest standards of integrity within the industry,” Maryland Insurance Commissioner Kathleen A. Birrane, chair of the NAIC’s Innovation, Cybersecurity, and Technology Committee, said when the model bulletin was adopted in December 2023.

