AI capable of screening x-rays in ‘seconds’

Stanford researchers have developed an AI algorithm capable of screening X-rays faster than radiologists.
20 December 2018

In some cases, the AI was more effective than trained radiologists.

Artificial Intelligence (AI) could be an answer to a global shortage of radiologists, with the technology now able to screen chest X-rays in a matter of seconds for at least 14 pathologies.

In most cases, the new algorithm performs as well as radiologists, according to the Stanford-led study.

The AI algorithm, dubbed CheXNeXt, marks a world-first in its ability to simultaneously evaluate X-rays for a multitude of possible maladies and return results that are consistent with the readings of radiologists.

Of the 14 pathologies in question, the algorithm reportedly performed just as well as radiologists on 10, underperformed on three, and outdid the experts on one.

“Usually, we see AI algorithms that can detect a brain hemorrhage or a wrist fracture — a very narrow scope for single-use cases,” said Matthew Lungren, MD, MPH, Assistant Professor of Radiology.

“But here we’re talking about 14 different pathologies analyzed simultaneously, and it’s all through one algorithm.”
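The "14 pathologies, one algorithm" design Lungren describes is what machine-learning practitioners call multi-label classification: rather than picking a single diagnosis, the model produces an independent probability for each finding, so several can be flagged on the same X-ray. A minimal sketch of that output stage, with an entirely hypothetical pathology list and scores (the study's exact labels and model are not given here):

```python
import math

# Hypothetical label set for illustration; not the study's exact list.
PATHOLOGIES = [
    "atelectasis", "cardiomegaly", "consolidation", "edema", "effusion",
    "emphysema", "fibrosis", "hernia", "infiltration", "mass",
    "nodule", "pleural_thickening", "pneumonia", "pneumothorax",
]

def sigmoid(x):
    """Squash a raw model score into a 0..1 probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_all(raw_scores):
    """Map one vector of per-pathology scores to independent probabilities.

    Multi-label classification: each pathology gets its own sigmoid output,
    so multiple findings can be flagged on a single X-ray at once.
    """
    assert len(raw_scores) == len(PATHOLOGIES)
    return {name: sigmoid(s) for name, s in zip(PATHOLOGIES, raw_scores)}

# Toy example: strong evidence of a mass and an enlarged heart, little else.
scores = [-3.0] * len(PATHOLOGIES)
scores[PATHOLOGIES.index("mass")] = 2.5
scores[PATHOLOGIES.index("cardiomegaly")] = 1.8
probs = predict_all(scores)
flagged = [name for name, p in probs.items() if p > 0.5]
# flagged -> ["cardiomegaly", "mass"]
```

This contrasts with the single-use systems Lungren mentions, which would emit one yes/no answer for one condition.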

For Lungren, the goal is to leverage these algorithms to quickly and reliably scan a wide range of image-based medical exams for signs of disease without the backup of professional radiologists.

The technology could be positioned as a high-quality digital consultation tool for resource-deprived regions that lack access to a radiologist's expertise.

AI could also play an important role in fully developed health care systems. The potential to expedite care, empowering primary care doctors to make decisions on X-ray diagnostics without a radiologist in attendance, could prove a great advantage.

“We’re seeking opportunities to get our algorithm trained and validated in a variety of settings to explore both its strengths and blind spots,” said graduate student Pranav Rajpurkar, speaking on next developments.

“The algorithm has evaluated over 100,000 X-rays so far, but now we want to know how well it would do if we showed it a million X-rays, and not just from one hospital, but from hospitals around the world.”

Building on the work of a previous iteration and having been in development for over a year, the diagnostic algorithm can now flag 14 ailments, including masses, enlarged hearts, and collapsed lungs.

For 11 of the pathologies, the algorithm made diagnoses that matched the accuracy of radiologists or better.

A ‘mad dash’

According to the announcement, the release of “hundreds of thousands” of X-rays by the National Institutes of Health in the summer of 2017 led to a “mad dash” among computer scientists and radiologists working with AI to deliver the best possible algorithm for chest X-ray diagnostics.

The Stanford team used about 112,000 X-rays to train the algorithm, with three radiologists reviewing a different set of 420 X-rays, one by one, for the 14 pathologies.

Their professional conclusions served as “ground truth” for each scan, eventually used to test how well the algorithm had learned signs of disease in an X-ray, and to see how well it performed against professionals.
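The evaluation setup described here, in which several radiologists' reads are pooled into a per-scan "ground truth" and the algorithm is scored against it, can be sketched as a majority vote. The vote rule and toy data below are illustrative assumptions, not the study's actual scoring protocol:

```python
# Hypothetical sketch: three radiologists' yes/no reads per scan are reduced
# to a majority-vote ground truth, and the algorithm's binary call for one
# pathology is then scored against that truth.

def majority_vote(reads):
    """Three radiologists' True/False reads -> ground-truth label for a scan."""
    return sum(reads) >= 2

def accuracy(algorithm_calls, radiologist_reads):
    """Fraction of scans where the algorithm matches the majority vote."""
    truth = [majority_vote(r) for r in radiologist_reads]
    hits = sum(call == t for call, t in zip(algorithm_calls, truth))
    return hits / len(truth)

# Toy data: 4 scans, each read by three radiologists (True = finding present).
reads = [(True, True, False), (False, False, False),
         (True, True, True), (False, True, False)]
calls = [True, False, True, True]  # the algorithm's calls on the same scans
score = accuracy(calls, reads)     # -> 0.75 (3 of 4 scans agree)
```

In practice, studies like this typically report richer statistics than raw accuracy, but the principle of treating expert consensus as the reference standard is the same.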

“We treated the algorithm like it was a student, and the 420 images were like the final exam,” said Lungren.

To further evaluate the performance of the algorithm, an additional nine radiologists from multiple institutions also took the ‘final exam’. The radiologists took about three hours on average to analyze the images, while the algorithm scanned the X-rays and diagnosed all pathologies in about 90 seconds.

Developing further

Future plans for the technology include in-clinic testing. While the algorithm is not ready for that yet, the hope is that the X-ray reading process for doctors diagnosing urgent care or emergency patients can be expedited.

“I could see this working in a few ways. The algorithm could triage the X-rays, sorting them into prioritized categories for doctors to review, like normal, abnormal or emergent,” Lungren said. Or the algorithm could sit bedside with primary care doctors for on-demand consultation, he added. In this case, the algorithm could step in to help confirm or cast doubt on a diagnosis.
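The triage workflow Lungren describes, sorting X-rays into prioritized queues before a doctor ever looks at them, can be sketched as a simple thresholding rule over the algorithm's per-pathology probabilities. The thresholds and the "emergent" finding list here are illustrative assumptions, not values from the study:

```python
# A minimal sketch of probability-based triage: each scan lands in one of
# three review queues based on the model's per-pathology probabilities.

EMERGENT = {"pneumothorax", "mass"}  # hypothetical urgent-finding list

def triage(probabilities, emergent_threshold=0.7, abnormal_threshold=0.5):
    """Return 'emergent', 'abnormal', or 'normal' for one scan's predictions."""
    if any(p >= emergent_threshold and name in EMERGENT
           for name, p in probabilities.items()):
        return "emergent"
    if any(p >= abnormal_threshold for p in probabilities.values()):
        return "abnormal"
    return "normal"

print(triage({"pneumothorax": 0.92, "effusion": 0.30}))  # -> emergent
print(triage({"cardiomegaly": 0.60}))                    # -> abnormal
print(triage({"mass": 0.10, "edema": 0.20}))             # -> normal
```

A real deployment would tune such thresholds against clinical outcomes; the point of the sketch is only that triage reduces to ordering scans by model confidence in urgent findings.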

“We should be building AI algorithms to be as good or better than the gold standard of human, expert physicians,” said Lungren.

“Now, I’m not expecting AI to replace radiologists any time soon, but we are not truly pushing the limits of this technology if we’re just aiming to enhance existing radiologist workflows. Instead, we need to be thinking about how far we can push these AI models to improve the lives of patients anywhere in the world.”