Airmic poll reveals lack of AI risk assessments among firms


Updated methodology to be produced


By Terry Gangcuangco

A recent survey conducted among Airmic members has shed light on a concerning gap in risk management practices related to artificial intelligence (AI).

Conducted on February 26, the poll found that up to half of organisations have yet to carry out risk assessments for AI technologies. Among those that have examined the risks, data protection and intellectual property emerged as the primary areas of concern.

Other threats cited include the risk of making decisions based on inaccurate information, as well as ethical risks and risks relating to bias and discrimination.

“Research indicates that most organisations, when they do conduct an AI risk assessment, are using traditional risk assessment frameworks better suited to the pre-AI world of assessment – this is an area of risk management still in its infancy for many,” Airmic chief executive Julia Graham stated, highlighting how existing frameworks fall short of addressing the unique challenges posed by AI.

Meanwhile, Hoe-Yeong Loke, Airmic’s head of research, commented on how governments are responding on the AI regulation front.

“Many governments are just beginning to develop policies and laws specific to AI, while those that have are competing to put their stamp on how this emerging technology will develop,” the research head said.

“Understandably, there is no universally accepted model for assessing AI risk, but risk professionals can look to recent published standards such as ISO/IEC 23894:2023 Artificial intelligence: Guidance on risk management.”

In response to the findings, Airmic announced plans to develop an updated methodology for AI risk assessments. The initiative will involve collaboration with Airmic members and industry stakeholders, with the aim of crafting a framework tailored to the unique risks associated with AI technologies.
