Scaling the Learning Curve
By Beth W. Orenstein
Radiology Today
Vol. 24 No. 6 P. 18
Could AI tools be the key to alleviating the burden of reading chest X-rays?
The chest X-ray, used to evaluate the lungs, heart, and chest wall, is already the most commonly performed diagnostic X-ray exam, according to the ACR, and demand for chest X-rays is rising, thanks in part to an aging population and the COVID-19 pandemic. At the same time, all physicians, including radiologists, are in high demand; a report by the Association of American Medical Colleges estimates the country could see a shortage of as many as 124,000 physicians in both primary and specialty care by 2034. AI is seen as at least part of a potential solution to radiology's workload and workforce issues, so it is good news that a number of studies have shown how patient care can benefit from the use of AI with chest radiographs. Some recent research highlights the promising role of AI in chest imaging.
Quick Triage
Every year, more than 7 million people in the United States show up in emergency departments (EDs) with acute chest pain, defined as discomfort in the chest or severe pain that spreads to the back, neck, shoulders, arms, or jaw. Three major causes of chest pain can be life-threatening: acute coronary syndrome, pulmonary embolism, and aortic dissection. While fewer than 8% of ED patients are ultimately found to have one of these major causes, it is critical that ED staff make an accurate, fast diagnosis. Triaging patients, especially in overcrowded EDs and hospitals with few beds to spare, is essential. Adding to the challenge, clinical tests, such as electrocardiograms and blood tests, often have low specificity. And while physicians are likely to order cardiovascular and pulmonary scans, the additional diagnostic imaging often yields negative results.
Radiologists at Massachusetts General Hospital (MGH) in Boston have developed an open-source deep learning (DL) model to identify patients with acute chest pain syndrome who are at risk for 30-day acute coronary syndrome, pulmonary embolism, aortic dissection, or all-cause mortality, based on a chest X-ray. The DL model could help triage ED patients with acute chest pain more efficiently, the researchers say.
For their study, published in January 2023 in Radiology, the researchers used EHRs of patients presenting with acute chest pain syndrome who had a chest X-ray and additional cardiovascular or pulmonary imaging and/or stress tests at MGH or Brigham and Women’s Hospital in Boston between January 2005 and December 2015. Data from more than 50,000 patients (28,755 from MGH and 22,764 from Brigham and Women’s) were used to conduct this research.
The DL model was trained on 23,005 patients from MGH to predict a 30-day composite endpoint of acute coronary syndrome, pulmonary embolism, or aortic dissection and all-cause mortality based on chest X-ray images. The DL tool was able to predict adverse outcomes with a high degree of accuracy. It significantly improved predictions of these adverse outcomes beyond age, sex, and conventional clinical markers such as D-dimer blood tests. The model was able to maintain its diagnostic accuracy across age, sex, ethnicity, and race. Using a 99% sensitivity threshold, the model was able to defer additional testing in 14% of patients, compared with 2% when using a model only incorporating age, sex, and biomarker data.
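The 99% sensitivity threshold is, in effect, a deferral rule: choose the largest risk-score cutoff that still catches 99% of patients who go on to have the composite endpoint, then defer additional testing for everyone scoring below it. The sketch below is only an illustration of that thresholding logic, using synthetic data and hypothetical variable names, not the MGH group's model or code.

```python
import numpy as np

def deferral_cutoff(scores, labels, target_sensitivity=0.99):
    """Largest risk-score cutoff that still keeps sensitivity for the
    composite endpoint at or above the target."""
    positives = scores[labels == 1]
    # Allowing at most (1 - target) of positive cases to fall below the
    # cutoff puts the cutoff at the 1st percentile of positive scores.
    return np.quantile(positives, 1.0 - target_sensitivity)

# Synthetic cohort: 10,000 patients, roughly 8% with an adverse 30-day outcome.
rng = np.random.default_rng(0)
labels = (rng.random(10_000) < 0.08).astype(int)
scores = np.clip(rng.normal(0.2 + 0.4 * labels, 0.15), 0.0, 1.0)  # toy model output

cutoff = deferral_cutoff(scores, labels)
deferred = scores < cutoff  # candidates for deferring additional testing
kept_sensitivity = (scores[labels == 1] >= cutoff).mean()
print(f"cutoff={cutoff:.3f}  deferral rate={deferred.mean():.1%}  "
      f"sensitivity={kept_sensitivity:.1%}")
```

The sketch shows only how a sensitivity-constrained cutoff translates into a deferral rate; the published model's architecture and validation are described in the Radiology paper.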
“Our research is a proof-of-concept study to assess whether there is additional information in the chest X-rays that may be used to help triage acute chest pain patients at the ER,” says lead author Márton Kolossváry, MD, PhD. “Our results indicate that deep learning is able to identify patterns on images that provide additive information above and beyond age, sex, and laboratory results. This indicates that in the future, it may be beneficial to train large AI models that use chest X-rays to help clinicians in their decisions regarding these patients. This may speed up the triage process and therefore shorten waiting times in the ER.”
However, Kolossváry cautions that his group’s proof-of-concept study is just the first step in developing AI tools that may help physicians in the ED. “Our model was trained on previously acquired data and validated based on data from another hospital,” he says. “While this provides robust evidence of the clinical efficacy of our model, we still need more information on the generalizability of our results. The current phase of the models should not be used for clinical decision-making.”
At the same time, Kolossváry is hopeful that the open-source model will provide the basis for complex model development that may be approved for clinical decision-making in the future.
Predicting Disease
Current guidelines recommend the use of a risk calculator to estimate the 10-year likelihood of a patient having a cardiovascular event and determine who should be given a statin for primary prevention. Statin medication is recommended for patients with a 10-year cardiovascular risk of 7.5% or higher.
This risk calculator requires nine input variables, including age, sex, race, systolic blood pressure, hypertension, type 2 diabetes, and blood tests. However, these variables often are not available in the EHR.
Researchers at the Cardiovascular Imaging Research Center at MGH and the AI in Medicine program at Brigham and Women’s Hospital developed a DL model that offers a potential solution for population-based opportunistic screening of cardiovascular risk using existing chest X-rays. They presented their findings at RSNA’s annual meeting in November 2022. The study’s lead author, Jakob Weiss, MD, says the model, called CXR CVD-Risk, predicts cardiovascular mortality from a routine upright chest radiograph image as the only input.
The model was developed using data from the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial and included more than 140,000 images from more than 40,000 individuals, Weiss says. The researchers tested the model using a second independent cohort of 11,430 outpatients (mean age 60.1 years; 42.9% male) who had a routine outpatient chest X-ray at Mass General Brigham and were potentially eligible for statin therapy.
Of 11,430 patients, 1,096, or 9.6%, suffered a major adverse cardiac event over the median follow-up of 10.3 years. There was a significant association between the risk predicted by the CXR CVD-Risk DL model and observed major cardiac events.
The researchers also compared the prognostic value of the model to the established clinical standard for deciding statin eligibility. This prognostic value could be calculated in only 2,401 patients (21%) due to missing data (eg, blood pressure, cholesterol) in the electronic record. For this subset of patients, the CXR CVD-Risk model performed similarly to the established clinical standard and even provided incremental value, Weiss says.
Weiss says the beauty of the AI model is that it needs only a single chest radiograph, a common and low-cost exam that is often already available in the EHR. That makes it a potential tool for opportunistic cardiovascular risk screening when the traditional risk calculator cannot be used because of missing input variables.
Weiss says the researchers do not envision replacing the guideline-recommended risk calculator with the AI model but rather see it as a tool to complement the current approach. “Implementation into clinical workflows could be done with minimal disruption,” Weiss says.
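One way to picture "complementing" the current approach is as a simple fallback rule: use the guideline calculator when all of its inputs are present, and fall back to an image-based estimate from an existing chest X-ray when they are not. The sketch below is only an illustration of that workflow; the function and field names are hypothetical and do not reflect the CXR CVD-Risk software or its interface.

```python
from typing import Callable, Optional

RISK_THRESHOLD = 0.075  # statin recommended at a 10-year risk of 7.5% or higher

CALCULATOR_INPUTS = ("age", "sex", "race", "systolic_bp", "bp_treated",
                     "diabetes", "total_cholesterol", "hdl_cholesterol", "smoker")

def assess_statin_eligibility(record: dict,
                              risk_calculator: Callable[[dict], float],
                              cxr_model: Callable[[str], float]) -> Optional[dict]:
    """Prefer the guideline risk calculator; fall back to an image-based
    estimate when any of the calculator's inputs are missing."""
    if all(record.get(k) is not None for k in CALCULATOR_INPUTS):
        risk, source = risk_calculator(record), "risk calculator"
    elif record.get("chest_xray_path") is not None:
        risk, source = cxr_model(record["chest_xray_path"]), "chest X-ray model"
    else:
        return None  # neither pathway is possible for this patient
    return {"risk": risk,
            "statin_recommended": risk >= RISK_THRESHOLD,
            "source": source}
```

Passing the calculator and the imaging model in as callables keeps the triage rule separate from either risk estimator; in practice, any such rule would itself need clinical validation.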
Weiss says his group has shown that a chest X-ray is more than a chest X-ray. However, he adds, future studies should focus on whether AI and chest imaging can actually help improve clinical decision-making and patient outcomes.
Improving Lung Nodule Detection
A lung nodule is the primary chest X-ray finding of lung cancer, but it is well known that readers can miss nodules. AI-based computer-aided detection (AI-CAD) software annotates nodules with heatmaps so that readers can easily identify them, says Jin Mo Goo, MD, PhD, from the department of radiology at Seoul National University Hospital in Korea. Goo and colleagues studied the actual effect that AI has in clinical practice and found that it significantly improved the detection of lung nodules on chest X-rays. They published their findings in April 2023 in Radiology.
The researchers looked at 10,476 patients with an average age of 59, who had undergone chest X-rays at a health screening center between June 2020 and December 2021. Goo says their trial was conducted with a pragmatic approach, which included enrolling almost all participants as they would in a real clinical setting. The patients were randomly divided evenly into two groups: AI or non-AI. The first group’s X-rays were analyzed by radiologists aided by AI, while the second group’s X-rays were interpreted without the AI results.
Lung nodules were identified in 2% of the patients. Analysis showed that the detection rate for actionable lung nodules on chest X-rays was significantly higher when aided by AI (0.59%) than without AI assistance (0.25%). No differences in the false-referral rates between the AI and non-AI interpreted groups were found.
Goo says the study “provided strong evidence that AI could really help in interpreting chest radiography.” This could contribute to identifying chest diseases, especially lung cancer, more effectively at an earlier stage, he says.
The workflow of implementing AI-CAD in clinical practice is relatively simple, Goo says. However, “Users should understand that there are false positives and false negatives in AI-CAD results.” In addition, AI-CAD was not trained to detect all types of abnormalities and devices, he notes.
AI and Normal Chest X-rays
Yet another study published in March 2023 in Radiology found that an AI tool can accurately identify normal chest X-rays in a clinical setting. This AI tool is approved in Europe for autonomous reporting of normal chest X-rays. The significance of this study, says coauthor Louis Lind Plesner, MD, from the department of radiology at Gentofte Hospital in Copenhagen, Denmark, is that the AI tool could be used to help offset the heavy workload experienced by radiologists around the globe.
Researchers used a commercially available AI tool to analyze the chest X-rays of 1,529 patients (median age 69 years) from four hospitals in Denmark. The X-rays from January 2020 were of patients from EDs, hospital floors, and outpatient clinics. Two expert thoracic radiologists labeled the chest X-rays as critical, other remarkable, unremarkable, or normal (no abnormalities). A third chest radiologist was used in cases of disagreement. All three radiologists were blinded to the AI tool results. The AI tool labeled the X-rays as high confidence normal or not.
The AI tool identified 28% of the chest X-rays that the expert radiologists had labeled normal as high-confidence normal, while its sensitivity for abnormal chest X-rays was 99.1%. This means that 28% of the normal chest radiographs could have been safely reported autonomously by the AI tool, which corresponded to 7.8% of the total chest X-ray workload, Plesner says.
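The two percentages are linked by the prevalence of normal exams in the cohort. A rough back-of-the-envelope check, using only the figures quoted above (the study's exact case counts are not repeated here), shows how the 28% of normals translates into 7.8% of the overall workload:

```python
# Rough check relating the two figures quoted from the study.
total_exams = 1529                    # chest X-rays in the test cohort
share_of_workload_automated = 0.078   # 7.8% of all exams flagged high-confidence normal
share_of_normals_automated = 0.28     # 28% of the radiologist-labeled normal exams

implied_normal_prevalence = share_of_workload_automated / share_of_normals_automated
exams_removed = total_exams * share_of_workload_automated
print(f"implied prevalence of normal exams ~ {implied_normal_prevalence:.1%}")  # ~27.9%
print(f"exams that could be reported autonomously ~ {exams_removed:.0f}")       # ~119
```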
“We could not find a single chest X-ray in our database where the algorithm made a major mistake. Furthermore, the AI tool had a sensitivity overall better than the clinical board-certified radiologists,” Plesner says.
The AI tool performed especially well with the outpatient group, autonomously identifying nearly 12% of their X-rays as normal. “This suggests that the AI model would perform especially well in outpatient settings with a high prevalence of normal chest X-rays,” Plesner says.
The radiologists missed findings of potential clinical significance on 39 chest X-rays, while the AI tool missed one. Plesner says the most surprising finding was just how sensitive the AI tool they used was for all kinds of chest disease.
“No system can be perfect, and mistakes are unavoidable, even with human radiologists,” Plesner says. “What we have seen is that the mistakes by the AI were insignificant compared to the radiologists, who had a higher rate of overlooked findings.”
The researchers concluded that the AI tool could safely help eliminate nearly 8% of chest radiography readings.
Is this approach worth it for a 7.8% reduction in readings? “That is an excellent question, and it really depends on the setting,” Plesner says. “I think some factors that play into this are the specific radiologist shortage in individual centers; how many CXRs are usually unreported, as many centers don’t have the capacity to report all CXRs; the AI knowhow; and trust in these systems, but also the patient mix, meaning, for example, in hospital or outpatients. Further, the prevalence of normal CXRs, that is CXRs without any disease findings, varies depending on patient setting. Therefore, the 7.8% reduction may be bigger in other settings. For example, we found an automation rate of nearly 12% in outpatients, and this may get even higher if patients are younger and less ill in general, such as in a smaller private radiologist practice serving general practitioners as its main group.”
However, Plesner says, “We don’t yet know how the removal of normal chest X-rays would affect radiologists in daily clinic.” Radiologists reading 150 to 200 chest X-rays a day may appreciate some “normals” because they are usually very quick to report and provide somewhat of a break. “When these are removed, or maybe around 30% of them at least, then this may give a sense of the work being more burdensome, even though there are fewer CXRs to begin with,” Plesner says. “This is not to say we should not use this tool, but we should be aware of this possibility as well when implementing it.”
Plesner adds that the AI tool is built upon several DL models capable of performing, for example, detection of individual specific diseases as well. “In this may lie several potential benefits, including prioritization of radiologist workload, for example, if there is a limited time to report CXRs on any given day. Then the AI can not only remove the normal but also prioritize the remaining so that the chest X-rays with the most important findings for the patient can be reported first.”
— Beth W. Orenstein of Northampton, Pennsylvania, is a freelance medical writer and regular contributor to Radiology Today.