Mammographer’s Helper — AI Shows Much Promise in Cancer Detection, Treatment
By Beth W. Orenstein
Radiology Today
Vol. 21 No. 2 P. 14
The FDA approved computer-aided detection (CAD) for mammography in 1998. In 2002, the Centers for Medicare & Medicaid Services approved reimbursement for CAD. By 2010, CAD was used in about 74% of all mammogram interpretations, according to a study published in March 2018 in the Journal of the American College of Radiology. While conventional CAD programs are now widely used in mammography, CAD has not been shown to significantly improve readers’ diagnostic accuracy, according to a study in Radiology published in November 2019.
Many, however, consider CAD to be a precursor to AI and all that AI includes, such as machine learning, deep learning, and neural networks. AI in breast imaging is in its infancy, but radiologists believe AI is the natural next step beyond CAD and can take mammography further than CAD alone. It could make screening mammography not only more widely available globally but also more accurate. Some early research suggests AI’s potential roles in mammography are very promising.
Time Management
Among those who see an important and impressive role for AI in breast imaging is Constance Lehman, MD, PhD, a professor of radiology at Harvard Medical School and director of breast imaging and codirector of the Avon Comprehensive Breast Evaluation Center at Massachusetts General Hospital (MGH) in Boston. Here’s why she’s optimistic: First, Lehman says, AI requires huge, high-quality data sets. Breast imaging already has them, thanks to the Mammography Quality Standards Act, which took effect in 1994; the Society of Breast Imaging; and the large number of radiologists who have agreed to structured reporting and tracking outcomes.
Many breast imagers link to tumor registries, Lehman says, “so we know what the outcomes of our patients are and we are keeping our data up to date.” As a result, the radiology community has built a large number of databases of screening mammograms linked to registries that indicate whether a patient developed breast cancer a year or more after their screening test.
“So, we can leverage the strength of these quantity and quality databases and apply AI tools to the very important questions: Can computers read images? And can they help predict cancers?” Lehman says.
A second reason Lehman believes mammography and AI are a perfect pair is the potential for training AI to read breast images accurately, which would free breast imagers to devote their attention to the cases where their expertise is most needed. It would also allow screening mammography to be performed in the many areas of the world where it’s not currently available due to a lack of readers.
“You may have heard of radiologists who are afraid that they will be replaced by AI,” Lehman says. “I don’t think anything is further from the truth.”
Lehman says only a small portion of her day is devoted to reading batches of screening mammograms. More of her time is spent talking to patients and discussing findings with them and their health care providers. If a computer can be trained to take over the reading part of her job, it would free her “to spend more time with patients and providers and reach out more to women in my community to see that they’re regularly screened.” Globally, Lehman says, many women have not had the ability to have screening mammograms in their communities because of the cost and a lack of radiologists to interpret the exams.
“We know that mammography is the single proven best tool to reduce breast cancer mortality, and yet the vast majority of women in the world don’t have access to it,” Lehman says. “AI has the potential to improve access globally and shift screening mammography back to having a machine that can triage which cases need the attention of humans and which don’t.”
Lehman also sees a role for AI in providing more consistent and accurate interpretations of screening mammography. “While we know mammography is a fantastic tool, we also know there is human variation in how well it performs, and we truly believe that there are AI tools that can decrease variation,” she says. “The breast imaging community is truly excited about that.”
It may also be possible, Lehman says, that AI can be used to detect other potential health issues from the mammogram that the human eye is unable to extract. “Could AI be used to predict heart disease or Alzheimer’s from mammograms and what is shown on the study just outside the breast tissue?” she asks. The possibilities here are exciting, too, Lehman says.
AI and DBT
Digital breast tomosynthesis (DBT) is the new, better mammogram in the United States, says Emily F. Conant, MD, division chief of breast imaging and a professor of radiology at the Hospital of the University of Pennsylvania in Philadelphia. DBT, which produces a quasi-3D image from reconstructed “slices” of the breast, has been shown to be better at finding cancers and reducing false-positives than conventional 2D mammography. But the image data set is huge, Conant says. “With each tomosynthesis study, you have many more layers of digital data to go through than in a conventional 2D study.”
Conant sees AI potentially playing a significant role in aiding the chain of tasks from reconstruction to interpretation of these images. “AI can not only drive the reconstruction of the huge acquisition data sets to provide more detail and higher dimensionality to readers; it can also aid in interpretation and help in decreasing the time it takes to read DBT studies,” she says. In a reader study published in July 2019 in Radiology: Artificial Intelligence, Conant and colleagues found that the concurrent use of AI when interpreting DBT studies not only improved radiologists’ diagnostic accuracy but also reduced their reading time by approximately 50%.
While AI applications for DBT are advancing rapidly, Conant warns that, compared with 2D mammography, there are few large, well-annotated DBT data sets from which robust AI algorithms can be developed. DBT image acquisition methods also vary more across vendors than 2D mammographic acquisitions do. “This is a challenge when creating AI algorithms suitable for DBT studies,” she says. A woman’s DBT studies can look very different, depending on which manufacturer’s equipment was used.
“If you had a 2D mammogram by vendor X one year and a 2D mammogram by vendor Y the next, they’d look pretty similar and the data would be pretty similar to analyze. However, if you had a tomosynthesis by vendor X last year and by vendor Y this year, the resulting DBT image set could look very different,” Conant says. “This variation in acquisition presents a challenge to creating robust AI algorithms.”
In addition, one of the keys to finding early breast cancer is the ability to spot subtle changes on a patient’s images from year to year, Conant notes. A challenge for AI is to compare DBT data across time periods regardless of the equipment that was used. “As DBT AI databases increase, AI will be better positioned to address these issues, which will help further improve patient outcomes,” she says.
Breast density is also an important issue in mammography. Breast cancers may be “masked” or hidden by dense breast tissue, making detection difficult, even with DBT, Conant says. Current AI algorithms can help create image sets that enhance lesions and, therefore, help improve cancer detection, she says.
Like Lehman, Conant sees AI eventually becoming the initial reader, especially in remote regions with few breast imagers. “There may be a certain percentage of images that don’t need to be read by a radiologist,” she says. “The so-called standalone performance of an AI algorithm is just that—AI reading the study without a radiologist. AI algorithms that serve as a first read to triage studies and flag exams that are AI-negative could help with workload.”
AI could also be used as the first read to single out complicated or worrisome cases and determine whether they should be moved to the top of the busy radiologists’ reading lists or perhaps flagged for double reading. “We could develop AI workflow algorithms to prioritize such complex cases on our PACS workstations,” Conant says. “Say it’s Monday morning, and I sit down to read cases. I want all my complex cases first so that I have the opportunity to contact those women with worrisome findings to return for additional imaging. AI could triage my cases in a way that not only optimizes the outcomes for the patients but also supports radiologists’ efficiency and preferences.”
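To make the triage idea concrete, here is a minimal sketch of the kind of worklist prioritization Conant describes, in which an AI suspicion score is used to reorder a radiologist’s reading list. The Exam fields, the score values, and the triage_worklist function are illustrative assumptions, not any vendor’s actual API or a deployed clinical workflow.

```python
from dataclasses import dataclass

@dataclass
class Exam:
    accession: str    # exam identifier (hypothetical)
    ai_score: float   # hypothetical AI suspicion score, 0 (likely benign) to 1 (suspicious)

def triage_worklist(exams, review_threshold=0.8):
    """Split exams into a prioritized group for immediate human review and a
    routine queue, then sort each group by descending suspicion score."""
    urgent = [e for e in exams if e.ai_score >= review_threshold]
    routine = [e for e in exams if e.ai_score < review_threshold]
    urgent.sort(key=lambda e: e.ai_score, reverse=True)
    routine.sort(key=lambda e: e.ai_score, reverse=True)
    return urgent + routine

# Example: the most worrisome cases float to the top of Monday morning's list.
worklist = [Exam("A1001", 0.12), Exam("A1002", 0.91), Exam("A1003", 0.55)]
for exam in triage_worklist(worklist):
    print(exam.accession, exam.ai_score)
```

In practice, the same score could also route AI-negative exams to a routine queue or flag borderline studies for double reading, as described above.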
Predicting Breast Cancer
Could AI also be used to help identify women whose mammogram may be clear but are likely to develop breast cancer in the future? Yes, hopefully, says Regina Barzilay, PhD, Delta Electronics Professor in the department of electrical engineering and computer science and a member of the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, who was diagnosed with breast cancer in 2014. Barzilay and her team at MIT have developed algorithms to predict whether a patient is likely to develop breast cancer in the next five years. The team, which included Lehman, developed the algorithms from a detailed database of pathology reports from more than 100,000 MGH patients over three decades. The machines were trained to register subtle changes in tissue and consider other influences including genetics, hormones, lactation, and weight changes, Barzilay says.
“We looked at almost 72,000 images and asked the machine to look at women who developed cancer, not in the first year, second year, or third year, but later, and to identify patterns that would match these outcomes,” she says. “Cancer takes a very long time to develop, and before we can see the cancer, the machine learns how the tissue will generate cancer in the future.”
Barzilay and her colleagues published a paper in April 2019 in the American Journal of Roentgenology that found their deep learning model could assess a patient’s five-year cancer risk from a breast MRI alone. It also showed that their deep learning model was better at predicting breast cancer than the state-of-the-art risk assessment model physicians currently use, the Tyrer-Cuzick (TC) model. The TC model estimates breast cancer risk based on a number of risk factors, including age, BMI, history of benign breast conditions and ovarian cancer, and the patient’s use of hormone replacement therapy. Although their research used breast MRI, Barzilay is optimistic that the approach could work for mammography as well.
Barzilay says her one concern with the research is that its database consisted mostly of white women, and more work has to be done to determine whether AI or deep learning is as accurate for other ethnicities. “We want to collect data from several institutions and study different populations and backgrounds,” she says. Her group is researching this question and expects to have data to publish within several months.
Conant says her colleagues at Penn are also highly interested in developing algorithms that pull not only from imaging data but also demographic and even genomic data to determine who may be at higher risk to develop breast cancer. “We can pull together data from the patient’s chart like her family cancer history, her age and race, prior breast biopsy history, and even genomic data, if available. We can combine that information with what we call radiomic data—data extracted from multimodality imaging, such as the texture, density, and even vascularity of her breast tissue—to generate estimates of her personal risk,” she says. “This is a very exciting area of AI research that may help better personalize screening algorithms for women over their lifetime.”
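One simple way to picture combining clinical and radiomic data, as Conant describes, is a model that takes chart variables and image-derived features as inputs and outputs a single risk estimate. The sketch below uses a basic logistic-regression pipeline with synthetic data; the feature names, the labels, and the model choice are assumptions for illustration, not the Penn group’s actual method or variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic training data. Columns: age, BMI, family-history flag,
# prior-biopsy flag, plus hypothetical radiomic features
# (percent breast density, texture score).
X = np.column_stack([
    rng.normal(55, 10, 500),   # age
    rng.normal(27, 5, 500),    # BMI
    rng.integers(0, 2, 500),   # family history of breast cancer (0/1)
    rng.integers(0, 2, 500),   # prior benign breast biopsy (0/1)
    rng.uniform(0, 1, 500),    # radiomic: percent density
    rng.uniform(0, 1, 500),    # radiomic: texture score
])
y = rng.integers(0, 2, 500)    # synthetic outcome labels (cancer within 5 years)

# Standardize features, then fit a logistic-regression risk model.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Risk estimate for one hypothetical patient.
patient = np.array([[48, 24.0, 1, 0, 0.72, 0.35]])
print("Estimated 5-year risk:", model.predict_proba(patient)[0, 1])
```

Real risk models of this kind would be trained on curated clinical registries and validated imaging features rather than random numbers, but the basic structure, combining chart data with radiomic inputs to produce a personalized estimate, is the same.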
Lehman and Conant see the potential of AI and mammography as only the first step. If the combination could be a better predictor of breast cancer than mammography alone, could it not be used to better treat those who eventually develop breast cancer? Breast imagers may eventually be able to use this information to determine who should have annual screening mammography and who should be scheduled for more intensive screening, possibly with other modalities such as ultrasound or MRI, Lehman says. “Risk stratification and being able to combine this with radiomics and what we know about the woman is a very exciting area,” she says. After that, should the woman develop breast cancer, the concept could be extended to determine the best treatment for her particular cancer.
AI and mammography could help personalize treatment as well, Conant agrees.
— Beth W. Orenstein, a freelance medical writer from Northampton, Pennsylvania, is a regular contributor to Radiology Today.