Only weeks after the discovery of the X-ray in 1895, two physicians in the United Kingdom used the technology to reveal a needle lodged in a woman’s hand. Physicians and scientists had their first glimpse of the inner bodies of living patients, and it was immediately clear that the tool would transform medicine. Radiology has been a cornerstone of modern healthcare ever since.
Another revolution is around the corner. In the last 40 years, computers have played a central role in advancing medical imaging, and today virtually all medical images are digitally rendered, stored, and transmitted. Now, thanks to huge gains in compute power, artificial intelligence is starting to automate diagnosis, promising to democratize medicine.
This is only possible because of great leaps forward in processing speed, which already allow doctors to review complex images with ease. For example, Siemens Healthineers’ syngo.via* visualization software crunches vast amounts of imaging data, rendering scans to produce strikingly realistic 3-D images.1 With an upgrade to the next-generation chip—the Intel® Xeon® Gold 6148 processor, released only one year later—these incredibly detailed images are rendered 40 percent faster.
“Our daily work has been interacting with very sophisticated computer software, such as Siemens Healthineers' syngo.via, that is now doing more than just putting a picture in front of us; it makes looking at these large image files easier, more intuitive, and faster,” said Dr. Nick Bryan, director of diagnostics for the Dell Medical School at the University of Texas at Austin.
“The software now makes more quantitative measurements, like how narrow is a vessel, how big are the ventricles. These measurements are becoming increasingly important components of the final report.”
Now that software automates more measurements, yielding richer, machine-readable data, machine learning models can be trained to classify discrete image features, using historical inputs and statistics to indicate a likely diagnosis. Radiologists sit on the richest data stockpile in all of medicine with which to train their algorithms: medical images make up 90 percent of all data in the healthcare industry.2
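The article doesn't describe any particular model, but the idea of classifying exams from quantitative measurements can be sketched with a toy nearest-centroid classifier. Everything here is invented for illustration: the feature names (ventricle volume, vessel diameter), the numbers, and the labels are hypothetical, not drawn from any real clinical dataset.

```python
# Toy sketch: classify exams as "normal" vs "abnormal" from quantitative
# measurements. All data and labels below are invented for illustration.

def centroid(rows):
    """Mean of each feature across a list of measurement vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def train(examples):
    """examples: list of (features, label). Returns label -> centroid."""
    by_label = {}
    for feats, label in examples:
        by_label.setdefault(label, []).append(feats)
    return {label: centroid(rows) for label, rows in by_label.items()}

def classify(model, feats):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(feats, c))
    return min(model, key=lambda label: dist(model[label]))

# Hypothetical historical exams: (ventricle volume mL, vessel diameter mm)
history = [
    ([25.0, 4.1], "normal"),
    ([27.0, 3.9], "normal"),
    ([55.0, 2.1], "abnormal"),
    ([60.0, 2.4], "abnormal"),
]
model = train(history)
print(classify(model, [58.0, 2.2]))  # prints "abnormal"
```

Production systems use far richer models (deep networks over the pixels themselves), but the principle is the same: historical measurements plus known outcomes yield a statistical mapping to a likely diagnosis.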
“Despite many challenges, clearly there’s a lot of money going into AI solutions for medical imaging,” said Daniel Faggella of San Francisco–based business intelligence firm TechEmergence. “The sheer size of the addressable market makes healthcare and finance float to the top in terms of where AI can make a difference.”
Augmenting, not replacing
Yet radiologists can rest assured their jobs are safe. Artificial intelligence is only able to answer very narrow questions, at least for now.
For example, machine learning tools can determine whether a tumor has grown, or whether the ventricles in a hydrocephalus patient are bigger or smaller than they were on the previous examination; they can’t forecast a teenager’s likelihood of developing Parkinson’s by reading a CT scan.
“Some people have predicted that deep learning will replace radiologists in 10 years, but the reality is that it will only augment them,” said John Axerio-Cilies, chief operating officer of Arterys, which applies deep learning analysis to medical images via the cloud. “It will allow them to move faster to focus on what is important, and let the computer focus on all the repetitive, tedious tasks.”
Not all progress is so humble. In some areas, AI tools have a better diagnostic record than radiologists. Deep learning–based methods can be faster and more accurate than clinicians at detecting lesions or tumors in certain parts of the body, particularly when compared to fatigued or less experienced clinicians. Diagnostic errors affect about 5 percent of all patients in outpatient care in the United States, about 12 million people.3
“Radiologists make human mistakes. We overlook an observation, we are busy or distracted and miss something, forget about things,” said Bryan. “We tend to remember diseases that we’ve seen recently and forget ones we haven’t seen for a long time.”
Bryan explains that computers are starting to do what residents and fellows do now in an academic practice: a lot of the unglamorous work that saves radiologists time and keeps them in check. In this regard, AI is already doing a pretty good job.
“There are a lot of algorithms that are validated to work at about the level of our fellows.”
Illuminating the ‘black box’
Although radiologists need not fear for their jobs, they must learn to cooperate with their machine colleagues. Many radiologists are already training machine learning algorithms—tedious data entry work that requires enormous amounts of annotation, matching images with labels and outcomes.
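The annotation work described above can be pictured as pairing image identifiers with radiologist labels and confirmed outcomes. The sketch below is hypothetical: the field names, record format, and filtering rule are invented to illustrate the idea, not taken from any real annotation tool.

```python
# Hypothetical sketch of annotation records used to train a model:
# each scan is matched with a radiologist's label and, where available,
# a confirmed outcome. All identifiers and values are invented.

from dataclasses import dataclass

@dataclass
class Annotation:
    image_id: str   # identifier of the stored scan
    label: str      # radiologist's finding, e.g. "lesion" / "no finding"
    outcome: str    # confirmed result from follow-up, or "unknown"

def build_training_set(annotations):
    """Keep only (image, label) pairs backed by a known outcome."""
    return [(a.image_id, a.label) for a in annotations if a.outcome != "unknown"]

records = [
    Annotation("ct-001", "lesion", "biopsy-positive"),
    Annotation("ct-002", "no finding", "unknown"),
    Annotation("ct-003", "lesion", "biopsy-negative"),
]
print(build_training_set(records))
# prints [('ct-001', 'lesion'), ('ct-003', 'lesion')]
```

The tedium the article mentions lies in producing thousands of such records by hand; the payoff is that each labeled pair gives the model a verifiable example to learn from.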
But the hard work promises to pay off. Such annotation is necessary to avoid the classic “black box” problem, where machine learning tools offer predictions without explanation, and it increases the precision and sophistication of the AI. This is allowing radiologists to take on increasingly ambitious projects.
“It’s pretty easy to diagnose the really abnormal tumors; anyone can find that mass on a scan for the most part. But how can we confidently say that there’s nothing wrong with you so that you can go home and not worry?” asks Dr. Safwan Halabi, clinical assistant professor of radiology at Stanford.
He says that the challenge is to define what is normal. What does a 50-year-old brain look like, or a normal fetus? What’s needed is more imaging data, partnered with genetic data, medical histories, and other information. But the payoffs would be enormous, as artificial intelligence could become truly predictive.
“I have colleagues here that are developing informatics consults of what’s the right drug to give you with your genetic background, your health history, your smoking history,” Halabi says. “What is the right drug to give you for hypertension when there are 50-100 drugs for blood pressure? We can only get to that with data analysis.”
While such solutions may be years, if not decades, away, there are pressing reasons to adopt the machine learning tools available now.
Easing the burden on radiologists and budgets
Radiologists are now under great pressure to read more examinations. And despite major efforts to control and restrict the use of medical imaging, the number of exams continues to increase.4 This problem is particularly acute in countries with aging populations.
Healthcare policy also puts pressure on imaging costs, which may benefit companies promising to lower radiology costs with AI.
“The single-payer healthcare systems face significant cost pressures that make them more receptive to this type of technology, and they have the ability to implement system-wide solutions,” says Sally Daub, CEO of San Francisco–based Enlitic, which uses deep learning to support clinical decisions. She expects to see solutions for specific use cases deployed across public health systems outside the U.S. within 24 months.
“AI technology like Enlitic’s won’t only augment and enhance the work of radiologists, but also improve the workflow for admitting, seeing, prioritizing, and discharging patients, helping to eliminate expensive logjams in the system.”
AI could also play a role in the developing world, offering imaging diagnostics in regions with a chronic shortage of radiologists, such as sub-Saharan Africa and China.
While radiology has become a notoriously cost-intensive specialty, AI-driven technologies promise to benefit more people than ever before. AI will help improve diagnostic accuracy, lower medical costs, and support testing for more of the population.
This promises to deliver the dream of the X-ray’s discoverer, German physicist Wilhelm Conrad Röntgen, who refused to cash in on his discovery, eschewing patents, contracts, and licenses in the belief that X-rays should benefit all of humanity.5