How good would an algorithm have to be to take care of your job?
It’s a new question for many workers amid the rise of ChatGPT and other artificial intelligence programs that can hold conversations, write stories and even generate songs and images in seconds.
However, for doctors who review scans to detect cancer and other diseases, AI has been around for about a decade, with a growing number of algorithms promising to improve accuracy, speed up work and, in some cases, take over entire chunks of the job. Predictions have ranged from doomsday scenarios in which AI completely replaces radiologists to sunny futures in which it frees them to focus on the more rewarding aspects of their work.
That tension reflects how AI is being implemented in healthcare. Beyond the technology itself, much depends on doctors’ willingness to put their trust (and their patients’ health) in the hands of increasingly sophisticated algorithms that few understand.
Even within the field, opinions differ on how much radiologists should embrace this technology.
“Some of the AI techniques are so good that, frankly, I think we should use them now,” said Dr. Ronald Summers, a radiologist and AI researcher at the National Institutes of Health. “Why do we let that information sit on the table?”
Summers’ lab has developed computer-assisted imaging programs that detect colon cancer, osteoporosis, diabetes and other conditions. None of them have been widely adopted, which he attributes to the “culture of medicine,” among other factors.
Radiologists have used computers to enhance images and flag suspicious areas since the 1990s. But the latest AI programs can go much further, interpreting scans, offering a diagnosis and even drafting written reports of their findings. The algorithms are typically trained on millions of X-rays and other images collected in hospitals and clinics.
Across medicine, the FDA has approved more than 700 AI algorithms to assist doctors. More than 75% of them are designed for radiology, yet only 2% of radiology practices use such technology, according to a recent estimate.
Despite all the industry’s promises, radiologists see a number of reasons to be skeptical of AI programs: limited testing in real-world settings, lack of transparency about how they work, and questions about the patient demographics used to train them.
“If we don’t know which cases the AI was tested on, or whether those cases are similar to the types of patients we see in our practice, there is an open question of whether it is going to work for us,” said Dr. Curtis Langlotz, a radiologist who runs an AI research center at Stanford University.
To date, all FDA-approved programs require a human being to be in the loop.
In early 2020, the FDA held a two-day workshop to discuss algorithms that could work without human supervision. Shortly afterward, radiology professionals warned regulators in a letter that they “strongly believe that it is premature for the FDA to consider approval or clearance” of such systems.
But in 2022, European regulators approved the first fully automated software that reviews chest X-rays that appear healthy and normal and writes reports on them. The company behind the software, Oxipit, is submitting its application to the FDA for use in the US.
The need for such technology in Europe is urgent, with some hospitals facing months-long delays in scans due to a shortage of radiologists.
In the United States, that kind of automated detection is likely years away. Not because the technology isn’t ready, according to AI executives, but because radiologists are not yet comfortable handing over even routine tasks to algorithms.
“We try to tell them that they are over-treating patients and wasting a lot of time and resources,” said Chad McClennan, CEO of Koios Medical, which sells an artificial intelligence tool for reading thyroid ultrasounds, the vast majority of which show no cancer. “We tell them: ‘Let the machine look at it, you sign the report and that’s it.’”
Radiologists tend to overestimate their own accuracy, McClennan says. Research by his company found that doctors reviewing the same breast scans disagreed with one another more than 30% of the time about whether to perform a biopsy. The same radiologists even disagreed with their own initial assessments 20% of the time when they saw the same images a month later.
According to the National Cancer Institute, about 20% of breast cancers go undetected during routine mammograms.
And then there are the potential cost savings. On average, American radiologists earn more than $350,000 a year, according to the Department of Labor.
In the short term, experts say AI will function like autopilot systems on airplanes, performing important navigation functions but always under the supervision of a human pilot.
That approach offers peace of mind to both radiologists and patients, says Dr. Laurie Margolies of the Mount Sinai hospital system in New York, which uses Koios’ breast imaging AI to get a second opinion on breast ultrasounds.
“I tell patients, ‘I looked at it, the computer looked at it, and we both agree,'” Margolies said. “To hear me say that we both agree, I think that gives the patient an even greater level of confidence.”
The first large, rigorous trials comparing AI-assisted radiologists with those working alone provide clues to potential improvements.
Initial results from a Swedish study of 80,000 women showed that a single radiologist working with AI detected 20% more cancers in mammograms than two radiologists working without the technology.
In Europe, mammograms are reviewed by two radiologists to improve accuracy. But Sweden, like other countries, faces a labor shortage, with only about 70 breast radiologists in a country of 10 million people.
According to the study, using AI instead of a second reviewer reduced human workload by 44%.
Still, the study’s lead author says it is essential that a radiologist make the final diagnosis in all cases.
If an automated algorithm misses a cancer, “it will be very bad for caregiver trust,” said Dr. Kristina Lang of Lund University.
The question of who would be held liable in such cases is one of the thorny legal issues that still need to be resolved.
One result is that radiologists are likely to keep verifying every AI determination so they are not held responsible for an error. That would eliminate many of the intended benefits, including reduced workload and burnout.
Only an extremely accurate and reliable algorithm would allow radiologists to truly step away from the process, says Dr. Saurabh Jha of the University of Pennsylvania.
Until such systems emerge, Jha compares AI-assisted radiology to someone offering to help you drive by looking over your shoulder and constantly pointing out everything on the road.
“That doesn’t help,” Jha says. “If you want to help me drive, take over the driving so I can sit back and relax.”