That doctors could look into the human body without making a single incision once seemed like a miracle. Medical imaging has come a long way since then, and the latest technology goes much further: harnessing the massive computational capabilities of artificial intelligence (AI) and machine learning to scan the body for differences that even the trained human eye can miss.
Imaging in medicine now includes sophisticated methods for analyzing each data point to distinguish disease from health and signal from noise. If the first few decades of radiology were all about improving the accuracy of images of the body, the next decades will be devoted to interpreting that data to ensure nothing is overlooked.
Imaging is also evolving beyond its primary focus – diagnosing medical conditions – to play an integral role in treatment as well, particularly in cancer. Doctors are starting to rely on imaging to monitor tumors and the spread of cancer cells, giving them a better and faster way to see whether treatments are working. This new role for imaging will change the types of treatments patients receive and vastly improve the information doctors get about how well those treatments are working, so they can ultimately make better choices among their treatment options.
“In the next five years, we’ll see functional imaging become a part of care,” says Dr. Basak Dogan, associate professor of radiology at the University of Texas Southwestern Medical Center. “We do not see that the current standard imaging answers the real clinical questions. But functional technologies will be the answer for patients who want higher accuracy in their care so they can make better informed decisions.”
Detecting problems earlier
The first hurdle is getting the most out of what the images—whether X-rays, computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound—can show, and automating their reading as much as possible to save specialists precious time. Computer-assisted algorithms have proven their value here, as massive computing power has made it possible to train computers to distinguish abnormal from normal results. Software specialists and radiologists have collaborated for years on these algorithms: radiologists feed computer programs their findings on tens of thousands of normal and abnormal images, teaching the computer to recognize when an image contains features that fall outside normal parameters. The more images a computer can compare and learn from, the better it becomes at discerning those differences.
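The training loop described above can be sketched in miniature. The following toy Python example is purely illustrative: all data is synthetic, and a simple nearest-centroid rule stands in for the deep networks real products use. It shows the basic idea, though: learn what “normal” looks like from labeled examples, then flag images whose features fall outside those parameters.

```python
# Toy sketch of computer-aided detection training (synthetic data;
# real systems train deep networks on tens of thousands of
# radiologist-labeled studies).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "images" as 64-value feature vectors: normal scans are
# uniform, abnormal scans contain a bright lesion-like region.
normal = rng.normal(0.2, 0.05, size=(500, 64))
abnormal = rng.normal(0.2, 0.05, size=(500, 64))
abnormal[:, :8] += 0.5  # bright patch in the first 8 features

X = np.vstack([normal, abnormal])
y = np.array([0] * 500 + [1] * 500)  # 0 = normal, 1 = abnormal

# "Normal parameters" here are simply the average feature vector of
# each labeled class (a nearest-centroid classifier).
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(images):
    # Assign each image to the class with the closer centroid.
    d = np.linalg.norm(images[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

accuracy = (predict(X) == y).mean()
```

The more labeled examples feed the class averages, the more reliably new images land on the correct side of the decision boundary, which mirrors (in a very simplified way) why training sets of tens of thousands of images matter.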
For the FDA to approve an imaging algorithm, it must be accurate 80% to 90% of the time. To date, the U.S. Food and Drug Administration has approved about 420 such algorithms for various diseases (mostly cancer). The FDA still requires that a human be the final arbiter of what a machine-learning algorithm finds, but these techniques are valuable for flagging images with suspicious results for doctors to review, ultimately providing faster answers to patients.
At Mass General Brigham, doctors use about 50 of these algorithms to help care for patients, from detecting aneurysms and cancers to spotting blockages and signs of stroke among emergency-room patients, many of whom present with the common symptoms these conditions share. About half have been approved by the FDA; the rest are being tested in patient care.
“The goal is to find things early. If we can use computers to do that, that patient will be cured faster,” says Dr. Keith Dreyer, chief data science officer and deputy head of radiology at Mass General Brigham.
More comprehensive patient tracking
While computer-assisted triage is the first step in integrating AI-based support into medicine, machine learning has also become a powerful way to monitor patients and track even the smallest changes in their conditions. This is particularly critical in cancer, where the daunting task of determining whether a person’s tumor is growing, shrinking, or remaining stable is essential to deciding whether treatments are working. “We have a hard time understanding what happens to the tumor while patients are undergoing chemotherapy,” Dogan says. “Unfortunately, our standard imaging techniques can’t detect any change until halfway through chemotherapy” — which can be months into the process — “when some kind of shrinkage begins to occur.”
Imaging can be useful in these situations by capturing changes in tumors that are unrelated to their size or anatomy. “In the very early stages of chemotherapy, most of the changes in the tumor are not at the level of cell death,” Dogan says. “The changes are associated with modulating the interactions between the body’s immune cells and cancer cells.” In many cases, the cancer does not shrink in a predictable way from the outside in. Instead, pockets of cancer cells within the tumor may die while others continue to thrive, leaving the overall mass more diffuse, like a moth-eaten jacket. In fact, because some cell death is associated with inflammation, tumor size may even increase, although this does not necessarily indicate growth of cancer cells. Standard imaging currently cannot distinguish how much of a tumor is still alive and how much is dead. The most common imaging techniques for breast cancer, mammography and ultrasound, are designed instead to capture anatomical features.
At UT Southwestern, Dogan is testing two ways to use imaging to track functional changes in breast cancer patients. In one study, funded by the National Institutes of Health, she images breast cancer patients after a single cycle of chemotherapy, injecting tiny gas bubbles and using ultrasound to measure small changes in pressure around the tumor. The bubbles tend to accumulate around tumors, since growing cancers have more blood vessels to support their expansion than other tissues do.
In another study, Dogan is testing optoacoustic imaging, which converts light into sound signals. A laser is shined onto the breast tissue, causing the cells to vibrate and produce sound waves that are captured and analyzed. The technique is well suited to detecting oxygen levels in tumors, since cancer cells tend to require more oxygen than normal cells to keep growing. Changes in the sound waves can reveal which parts of a tumor are still growing and which are not. “Just by imaging the tumor, we can tell which ones are more likely to go to the lymph nodes and which aren’t,” Dogan says. Currently, doctors cannot tell which cancers will spread to the lymph nodes and which will not. “It could give us information about how the tumor behaves and potentially save patients from unnecessary lymph node surgeries that are now part of standard care.”
This technology can also help find early signs of cancer cells that have spread to other parts of the body, before they appear on visual examination and without the need for invasive biopsies. Focusing on the organs to which cancer cells typically spread, such as the bones, liver and lungs, could give doctors a head start in picking up these new deposits of cancer cells.
Detecting invisible anomalies
With enough data and images, Dreyer says, these algorithms could find aberrations that no human can detect. His team is also developing an algorithm that measures certain vital signs in the human body, whether anatomical or functional, so it can identify changes in those metrics that might indicate a person is about to have a stroke, fracture, heart attack, or other adverse event. This is the holy grail of imaging, says Dreyer, and although it’s still a few years away, “these are the kinds of things that are going to be transformative in AI healthcare.”
Getting there will require enormous amounts of data from hundreds of thousands of patients, but siloed U.S. healthcare systems make gathering that information a challenge. Federated learning, in which scientists develop algorithms that are run against different institutions’ anonymized patient databases, is one solution: privacy is preserved, and organizations do not have to put their secure systems at risk.
If more of these models are validated, through federated learning or otherwise, AI-based imaging could start helping patients at home. As COVID-19 has made self-testing and telehealth more routine, people may eventually be able to obtain imaging information through a portable ultrasound driven by a smartphone app, for example.
“The real change in healthcare that will happen from AI is that it will provide a lot of solutions to patients themselves, before they become sick, so they can stay healthy,” Dreyer says. That is perhaps the most powerful way imaging can improve health: by enabling patients to make more informed decisions about protecting it.