Study shows ‘alarming’ level of trust in AI for life and death decisions

A US study that simulated life and death decisions has shown that humans place excessive trust in artificial intelligence when it guides their choices.


Having been briefly shown a list of eight target photos marked as either friend or foe, test subjects then had to make rapid decisions on whether to carry out simulated assassinations of individual targets via a drone strike. A second opinion on the validity of the targets was given by an AI. Unbeknownst to the humans, the AI's advice was completely random.


Despite being informed of the fallibility of the AI systems in the study, two-thirds of subjects allowed their decisions to be influenced by the AI. The work, conducted by scientists at the University of California – Merced, is published in Scientific Reports.

“As a society, with AI accelerating so quickly, we need to be concerned about the potential for overtrust,” said principal investigator Professor Colin Holbrook, a member of UC Merced’s Department of Cognitive and Information Sciences.

According to Holbrook, the study's design was a means of testing the broader question of placing too much trust in AI under uncertain circumstances. He said the findings are not just about military decisions and could apply to contexts such as police being influenced by AI to use lethal force, or a paramedic being swayed by AI when deciding who to treat first in a medical emergency. The researchers claim the findings could also be applicable to major life decisions such as buying a home.
