By Megan Headley
Establishing gestational age with ultrasound early in pregnancy is a critical step for informing obstetrical care. An accurate gestational age allows providers to identify fetal growth abnormalities, plan referrals, and decide if or when interventions are necessary for fetal benefit. Yet, as a study from researchers with the University of North Carolina (UNC) points out, the high cost of ultrasound equipment and the need for trained sonographers limit the technology’s use in low-resource settings. Artificial intelligence (AI) could be poised to change that.
The study, “AI Estimation of Gestational Age from Blind Ultrasound Sweeps in Low-Resource Settings,” funded by the Bill and Melinda Gates Foundation and published in March in NEJM Evidence, demonstrates how AI technology can empower nurse midwives to perform ultrasound scans at the level of trained sonographers. Dr. John Martin, chief medical officer for Butterfly Network, a manufacturer of portable ultrasound probes, sees the study’s success as a first step toward expanding obstetrical ultrasound access in low-income and rural settings, among other broad applications.
Central to the UNC study’s success were cost-effective, easily portable ultrasound machines that were deployed in both North Carolina and Zambia. As Martin explains, the cost of equipment has been a significant hurdle to widespread use of ultrasounds in developing areas, but so too has been a lack of reliable electricity. Inexpensive, battery-operated solutions like Butterfly Network’s Butterfly iQ+ device help address these issues. “The last hurdle,” Martin says, “is the expertise of people to actually do the sonogram so that they can get that meaningful information back to make clinical decisions.”
Addressing that last hurdle was the goal of the partnership between the Gates Foundation, UNC, and manufacturers like Butterfly Network. This partnership supports the Fetal Age Machine Learning Initiative, an ongoing project to develop technologies that expand obstetrical ultrasound access to settings where cost and logistics have traditionally prevented its use. Toward this end, the team behind Butterfly was charged with developing “a novel AI application that would essentially drive the expertise of the sonographer into the device itself,” Martin explains.
With most other AI tools, Martin notes, the user can simply point a probe at a relatively static organ and capture an image. “The problem with maternal fetal imaging is the baby’s moving, so you don’t have the advantage of really focusing in on an image and capturing that data,” he says.
To address this challenge, the UNC researchers worked with Butterfly Network’s engineers to develop a blind sweep approach. With this technique, the ultrasound operator moves the probe back and forth across the abdomen, three times horizontally and then three times vertically, to cover essentially the entire abdomen. These sweeps capture enough information for the AI tool to algorithmically estimate the gestational age.
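The study does not detail the model internals, but the sweep-and-aggregate workflow it describes can be sketched in rough pseudocode form. The sketch below is illustrative only: the `fake_sweep` frame source and `fake_model` predictor are placeholder assumptions standing in for real probe output and Butterfly Network's trained neural network, which are not public.

```python
# Illustrative sketch of a blind-sweep pipeline: the operator records six
# sweeps (three horizontal, three vertical), each yielding a sequence of
# ultrasound frames; a per-frame model predicts gestational age in days,
# and the predictions are pooled into one estimate.
from statistics import median

def collect_blind_sweeps(record_sweep):
    """Gather frames from 3 horizontal + 3 vertical sweeps of the abdomen."""
    frames = []
    for direction in ("horizontal", "vertical"):
        for pass_number in range(3):
            frames.extend(record_sweep(direction, pass_number))
    return frames

def estimate_gestational_age(frames, per_frame_model):
    """Pool per-frame predictions (days) into a single robust estimate."""
    predictions = [per_frame_model(f) for f in frames]
    # The median damps outlier frames, e.g., those blurred by fetal movement.
    return median(predictions)

# --- toy demonstration with synthetic stand-ins ---
def fake_sweep(direction, pass_number):
    # A real sweep would stream image frames; here, labeled tuples.
    return [("frame", direction, pass_number, i) for i in range(4)]

def fake_model(frame):
    # A real model would be a trained network; here, ~175 days with
    # small deterministic frame-to-frame variation.
    return 175 + (hash(frame) % 5 - 2)

frames = collect_blind_sweeps(fake_sweep)
ga_days = estimate_gestational_age(frames, fake_model)
print(f"{len(frames)} frames, estimated GA about {ga_days / 7:.1f} weeks")
```

The key design point this sketch captures is that no single frame needs to be a textbook biometry view; the estimate emerges from many imperfect frames, which is what frees the operator from needing sonography expertise.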
To test the effectiveness of the available technology and the blind sweep approach against conventional fetal measurements, the UNC researchers examined 4,695 pregnant volunteers from North Carolina and Zambia from September 2018 through June 2021. The researchers concluded that the AI model was able to estimate gestational age with accuracy similar to that of trained sonographers conducting standard fetal biometry.
This study shows that Butterfly Network is progressing toward its vision of allowing novice or trainee users to deploy imaging tools at the same level of expertise as someone with experience in the field, says Martin. The potential improvement in maternal care for developing nations is substantial, particularly as the Gates Foundation and Butterfly Network distribute 1,000 Butterfly iQ+ devices to healthcare workers in sub-Saharan Africa, but Martin also points to the many other ways AI imaging can drive better care.
“If you think about the way medicine is practiced today, it’s ‘get history, physical exam, and pause.’ It’s like that every time you’ve ever gone to the doctor, and it’s been the same in medicine forever,” Martin says. “The problem is that the ‘pause’ between the exam and the decision to add imaging is a significant barrier to what we’re trying to accomplish. … And the data suggests that most of the time, simple imaging answers clinical questions when we’re not sure what’s going on.”
Portable ultrasound devices are making imaging more accessible in a number of ways. Among other examples, portable ultrasound supported prenatal care during the onset of the COVID-19 pandemic, when physicians with Baylor College of Medicine in Houston, Texas, held a drive-through prenatal clinic. Pregnant women were able to remain in their cars while being assessed by a healthcare professional, reducing the potential for patient or staff exposure to COVID-19 while ensuring consistent care.
The ease of deploying portable, AI-driven ultrasound also carries promise for supporting obstetric care in rural environments. However, Martin notes that these applications open up questions about responsible use. Simply handing out ultrasound equipment revives the risk of “entertainment ultrasounds,” which the FDA warned against in 2015 as imaging facilities began offering keepsake ultrasound images for mothers-to-be. Martin suggests that in the future, doctors may be able to turn the technology on and off remotely, enabling use as needed to support care in high-risk pregnancies.
“As we look forward, we look for those clinical conditions and where the impact can be most significant,” Martin says. “Then, how can we accelerate this impact with tools to get people to competency much faster or drive the intelligence into the device?”