Results of AI vary for radiologists from helping to hindering accuracy

By Dennis Thompson, HealthDay News
The benefits of AI vary from doctor to doctor, and in some cases it can interfere with a radiologist's performance and accuracy, a new study shows. Photo by Adobe Stock/HealthDay News

Artificial intelligence tools don't always help radiologists better review a patient's X-rays or CT scans, a new study claims.

AI has been touted as a potential means of improving doctors' ability to interpret medical images, the researchers said.

However, the benefits of AI vary from doctor to doctor, and in some cases it can interfere with a radiologist's performance and accuracy, results show.

"We find that different radiologists, indeed, react differently to AI assistance: some are helped, while others are hurt by it," said co-senior researcher Pranav Rajpurkar, an assistant professor of biomedical informatics in the Blavatnik Institute at Harvard Medical School.

The results indicate that AI developers need to better understand how doctors interact with the programs, researchers said. That way, AI can be adjusted to boost human performance rather than hinder it.

"What this means is that we should not look at radiologists as a uniform population and consider just the 'average' effect of AI on their performance," Rajpurkar said in a Harvard news release. "To maximize benefits and minimize harm, we need to personalize assistive AI systems."

For the study, researchers tracked how AI affected the performance of 140 radiologists in 15 diagnostic tasks associated with X-ray images.

The radiologists were asked to assess 324 different patient cases involving 15 diseases that could be observed in chest X-rays, researchers said.

Researchers used advanced computer metrics to compare how doctors spotted and correctly identified diseases both with and without the assistance of AI.

They found that the effect of AI assistance was inconsistent and varied across radiologists. The programs improved performance in some doctors, and worsened it in others.

Surprisingly, researchers found that a doctor's personal experience in radiology did not reliably predict how well AI would help or hinder them.

Factors such as years of experience, a specialization in chest radiology and prior use of AI didn't consistently predict how AI would affect a doctor's performance, results showed.

Additionally, doctors who were not good at reading X-rays did not benefit consistently from AI assistance. Overall, lower-performing radiologists read X-rays poorly with or without AI, researchers found.

However, researchers found that more accurate AI tools tended to boost doctors' performance, while less accurate programs tended to get in the way of a proper diagnosis.

The study wasn't designed to explain why AI tools aren't always helpful to radiologists, researchers noted.

The new study was published Tuesday in the journal Nature Medicine.

The research team said AI programmers would do well to work with doctors who use their tools to make them more helpful, and to test the tools in experiments that can better hone their effectiveness.

Importantly, AI developers should make sure that their programs can "explain" their decisions, so doctors will be able to better detect inaccurate diagnoses, researchers said.

"Our research reveals the nuanced and complex nature of machine-human interaction," said co-senior author Nikhil Agarwal, a professor of economics at MIT. "It highlights the need to understand the multitude of factors involved in this interplay and how they influence the ultimate diagnosis and care of patients."

More information

NYU Langone Health has more about AI in imaging.

Copyright © 2024 HealthDay. All rights reserved.