Out of This World
By Beth W. Orenstein
Radiology Today
Vol. 22 No. 8 P. 22
Israeli Company Conducting Cardiac Ultrasound Study Onboard the International Space Station
Researchers have discovered that space travel affects the human heart. After astronaut Scott Kelly spent nearly a year aboard the International Space Station, from March 27, 2015, to March 1, 2016, scientists at the University of Texas Southwestern found that his heart mass had shrunk from 6.7 ounces to 4.9 ounces, a loss of about 27%. They reported their findings in the American Heart Association journal Circulation in March of this year.
With billionaires traveling in space (Richard Branson and Jeff Bezos raced to the edge of space earlier this summer in capsules their respective companies built) and space tourism seemingly poised to take off (in mid-September, the first all-nonastronaut crew blasted off aboard a SpaceX rocket on the three-day orbital mission known as Inspiration4), worldwide interest in how spending time in space affects the cardiovascular system is growing.
Then there’s this: Ultrasound, which can provide cardiac diagnostic information and ongoing cardiac monitoring, has become more portable and more compact over the last few decades. Whereas conducting CT or MRI in outer space isn’t ever likely to be practical, ultrasound in orbit has become feasible. The one hurdle to sending ultrasound into space to study the impact of space travel on the cardiovascular system is that ultrasound remains largely operator dependent; acquiring useful images takes skill.
Enter UltraSight, an Israel-based digital health company that has developed a software platform that uses AI to allow anyone to perform sonography at the point of care with handheld ultrasound devices. (UltraSight recently changed its name from OnSight Medical to emphasize its singular focus on ultrasound technology.)
When the Ramon Foundation and the Israel Space Agency put out a call for research studies to be conducted aboard their upcoming “Rakia” space mission, UltraSight applied to use its AI platform to prove that, with its real-time ultrasound guidance, any astronaut can acquire quality cardiac images simply and independently of ground mission control or trained medical professionals, says Davidi Vortman, the company’s CEO. UltraSight’s proposal was one of 44 accepted to be carried out during the commercial space mission. Rakia, which will launch in early 2022, will be the first mission to the International Space Station manned entirely by private astronauts. The mission was announced in November 2020.
Heart Monitor
Vortman says his company considers keeping space travelers heart healthy a major challenge because zero gravity can significantly impact cardiac function. “The goal of our study is to monitor the changes in the heart during space travel,” he says, and to use that information to help travelers prepare for their trip and remain healthy both while in space and once they return.
Another goal is to show that advances in ultrasound technology may hold the key. “We looked at the space mission as an opportunity to show that ultrasound is becoming more mobile and robust,” Vortman says. He hopes the results will show that UltraSight’s technology can be used to increase patient access to cardiac imaging and improve care back on Earth.
During the eight-day Rakia mission, astronaut Col. (res.) Eytan Stibbe, a former Israeli fighter pilot, will operate the handheld ultrasound device and acquire cardiac images of his colleagues using UltraSight’s real-time guidance. He will use the device three times during the mission: 24 hours after the spaceship launches, once while at the space station, and finally when the crew returns to Earth. Three astronauts will be aboard the spaceship.
Stibbe, the second Israeli in space, will come to NASA’s Johnson Space Center in Houston for a day’s training in using UltraSight before the launch. “It will be the first time he’s using that,” Vortman says. Vortman has no doubt that, with the training and the system’s real-time guidance, the images that Stibbe acquires will be useful for monitoring changes in heart anatomy during long stays in a microgravity environment.
“Using our AI technology, it’s going to be practical for Stibbe to perform the ultrasound in space,” Vortman says. The system integrates with point-of-care ultrasound devices that are already being used in emergency departments and inpatient service lines today; the software runs on a standard tablet, on top of the ultrasound application screen. It has a video game–like guidance system that tells the operator (in this case, Stibbe) how to hold and handle the ultrasound probe and where to place it for the best diagnostic-ready images.
Once it is turned on, a blue cross and a gray cross appear on the tablet’s screen. The operator moves the probe until the blue cross is directly over the gray cross. When the blue cross turns green, the operator knows it’s the best time to snap a picture. “It assures you that it’s the best possible image you can get,” he says.
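As described, the on-screen guidance amounts to a simple alignment test: the AI estimates where the probe is currently pointing (the blue cross), compares that with the pose it predicts will yield a diagnostic-quality view (the gray cross), and turns the cross green once the two coincide. The sketch below illustrates that logic in Python; UltraSight has not published its algorithm, so the function name, coordinates, and tolerance here are illustrative assumptions, not the company’s implementation.

```python
import math

# Assumed distance at which the guidance cross would turn green (hypothetical).
ALIGNMENT_TOLERANCE_MM = 3.0

def cross_color(probe_xy, target_xy, tolerance_mm=ALIGNMENT_TOLERANCE_MM):
    """Return the guidance-cross color for the current probe position.

    probe_xy  -- (x, y) position of the blue cross, i.e., where the AI
                 estimates the probe is pointing, in millimeters
    target_xy -- (x, y) position of the gray cross, i.e., the pose predicted
                 to give a diagnostic-quality view
    """
    dx = probe_xy[0] - target_xy[0]
    dy = probe_xy[1] - target_xy[1]
    distance = math.hypot(dx, dy)
    # Blue cross sitting over the gray cross within tolerance: capture now.
    return "green" if distance <= tolerance_mm else "blue"

# Example: the operator is about 1.5 mm off target, so the cross turns green.
print(cross_color((10.0, 5.5), (11.0, 4.4)))  # -> green
```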
Vortman says that even with trained sonographers, some 20% of cardiac ultrasound images are of “suboptimal” quality, which means they cannot be used to make decisions. “This system improves that and represents a major change in the industry,” he says.
Benefits of Sharing
Sharing the images with those who will interpret them is a separate task. The images that Stibbe acquires will be interpreted back on Earth at New York University (NYU) Grossman School of Medicine. UltraSight worked with researchers at NYU to develop its AI software platform; NYU provided around 500,000 3D cardiac ultrasound images, drawn from the more than 3 million in its database, to build the platform. UltraSight continues to add ultrasound images because the more hearts that are scanned, the “smarter” the AI becomes in providing guidance for the best picture, Vortman notes.
The idea for an AI-guided ultrasound device came from Israeli-born Achi Ludomirsky, MD, a professor in the department of pediatrics at NYU Grossman School of Medicine. Ludomirsky was distraught when several of his young patients died from sudden heart attacks suffered on the football field. He knew that, using ultrasound, their heart arrhythmias could have been detected earlier, possibly preventing their deaths. So he took his idea to the Weizmann Institute of Science, a multidisciplinary research institution in Rehovot, Israel, where he partnered with Associate Professor Yaron Lipman, PhD, and Itay Kezurer, UltraSight’s cofounders. Ludomirsky is now UltraSight’s CMO and Kezurer its chief technology officer.
One of the challenges in planning research for the space mission was developing AI software that works without connectivity to a network or cloud-based resources, Vortman says. “We were able to make it work with a wireless probe,” he says. That step was important because “you don’t want to use wires in space,” he adds. UltraSight also had to make sure its AI platform would work with the minimal resources that are available in space.
Vortman is confident that all systems are go and that, in addition to acquiring useful information about the effects of space travel on the human heart, the project will prove that “you don’t have to be a cardiologist or a sonographer to acquire high-quality cardiac ultrasound images.” The platform can be used with most major device manufacturers’ equipment.
Currently, most echocardiography is done in echocardiography labs. But many people around the world do not have access to such labs, especially those in rural areas and in underdeveloped and poorer countries. If people without medical training could use automated guidance systems like UltraSight’s, it could bring the benefits of cardiac imaging to more people, Vortman says.
The idea of having people who haven’t undergone extensive training in ultrasound techniques use the device—even with its automated guidance—is not without controversy. Some believe that it can work well enough in certain settings, while others are skeptical.
Mindy M. Horrow, MD, FACR, FSRU, president of the Society of Radiologists in Ultrasound, vice chair of the department of radiology at Einstein Healthcare Network, and a professor of radiology at the Sidney Kimmel Medical College in Philadelphia, finds the UltraSight experiment aboard the spaceship intriguing. As long as those who are involved are not responsible for interpreting the images, it has potential, she says. “It sounds like a cool, interesting idea to gain anatomical information about cardiac changes in space,” she says. “Could this be helpful? Sure, it could be.”
Horrow says that she often is sent video clips that were obtained by an offsite sonographer and “that allows me, as if I were standing there, to look at the clips and make my own decisions.” Assuming that the ultrasound device is set up appropriately, Horrow believes the astronaut will be able to take images that would be useful to the radiologists reading them on the ground.
If, during the Rakia mission, UltraSight is able to show that its AI platform performs well, it could lead to Earthly applications, Horrow says. For example, she says, pregnant women in rural areas who need sonograms to monitor fetal heart health might be able to receive care closer to home with such a system. “There are plenty of places in the world where there are midwives who could use these devices to supplement their care,” Horrow says.
A Good Problem
Laurance Grossman, MD, a radiologist at the Cleveland Clinic in Ohio, who serves on UltraSight’s medical advisory board, believes that AI ultrasound devices will expand the use of ultrasound without limiting the role of medical professionals who practice the imaging technique. “I don’t believe it’s going to replace our certified sonographers,” he says. “Working in tertiary centers almost my entire career, high-level ultrasound is vital. That’s never going to go away. I see AI as sort of like a Waze for cars. When I used to do mammography, we used computer-aided detection (CAD) as a guide to determine whether we were missing something and make sure we were not missing something. CAD has gotten better and better, but it didn’t replace anything. I see this as expanding ultrasound to areas where they don’t have access to certified trained sonographers. It’s going to expand the field of ultrasonography.”
The AI device might create what could be considered a “good problem,” more reading work, Grossman says. “We are going to have to do a lot more readings, and we are going to be looking at everything that it may point out, making our job as interpreters busier. But I don’t think it will just be an adjunct to what they are already doing in the medical center and everywhere else. It’s also going to provide help in areas where we never had that help before.”
Levon Nazarian, MD, FAIUM, FACR, president of the American Institute of Ultrasound in Medicine and a professor of radiology and vice chair for education at Thomas Jefferson University in Philadelphia, is more cautious than some of his colleagues about the current state of AI-guided ultrasound devices. “How can ultrasound be done in a way that is still providing a quality service to patients when you have people holding the probe who may not have had proper training?” he asks.
The best the astronaut aboard Rakia can do is “act as a technician,” Nazarian says. Yes, Nazarian says, it would be wonderful to be able to send someone in Uganda, armed with a small ultrasound machine, out into the field to diagnose a heart problem, but that person would have to be trained in the rudiments of the technology and able to be supervised at a distance for it to work reasonably well.
Everything done in imaging is operator dependent, Nazarian says. “The question is how can you take someone who has never been trained and make them good enough to take images that can be used to make diagnoses?” Nazarian isn’t convinced that AI technology is mature enough currently to be the answer. “AI sounds great,” he says, “but I want to see details.”
If the astronaut aboard Rakia succeeds in cardiac ultrasound imaging, “all the power to them,” Nazarian says. “But I think that for most of the vast array of ultrasound applications, we’re talking at least years away, if not a decade away, from becoming a viable reality.”
— Beth W. Orenstein of Northampton, Pennsylvania, is a freelance medical writer and regular contributor to Radiology Today.