
These AI-powered apps can hear the cause of a cough

Smartphone apps can differentiate between tuberculosis and other respiratory conditions. It’s part of an AI-driven trend: using sound to diagnose illnesses.

Image: a doctor holds up a phone toward a coughing patient; a sound wave propagates in the background. Stephanie Arnett/MITTR | Getty, iStock

This article first appeared in The Checkup, MIT Technology Review's weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

This week I came across a paper that uses AI in a way that I hadn’t heard of before. Researchers developed a smartphone app that can distinguish tuberculosis from other diseases by the sound of the patient’s cough.

The method isn’t foolproof. The app failed to detect TB in about 30% of people who actually had the disease. But it’s simpler and vastly cheaper than collecting phlegm to look for the bacterium that causes the disease, the gold-standard method for diagnosing TB. So it could prove especially useful in low-income countries as a screening tool, helping to catch cases and interrupt transmission.

In the new study, a team of researchers from the US and Kenya trained and tested their smartphone-based diagnostic tool on recordings of coughs collected in a Kenyan health-care center—about 33,000 spontaneous coughs and 1,200 forced coughs from 149 people with TB and 46 people with other respiratory conditions. The app’s performance wasn’t good enough to replace traditional diagnostics, but it could be used as an additional screening tool. The sooner people with active cases of TB are identified and receive treatment, the less likely they are to spread the disease.

This new paper is one of dozens that have come out in recent years that aim to use coughs and other body sounds as “acoustic biomarkers”—sounds that indicate changes in health. The concept has been around for at least three decades, but in the past five years, the field has exploded. What changed, says Yael Bensoussan, a laryngologist at the University of South Florida, is the growing use of AI: “With artificial intelligence, you can analyze a larger quantity of data faster.”

Covid also helped drive interest in cough analysis. The pandemic gave rise to 30 or 40 startups focusing on the acoustics of cough, Bensoussan says. AudibleHealthAI launched in 2020 and began working on a mobile app designed to diagnose covid. The software, called AudibleHealth DX, is currently being reviewed by the FDA. And now the company is branching out to influenza and TB.

The Australian company ResApp Health has been working on acoustic diagnosis of respiratory diseases since 2014, well before the pandemic. But when covid emerged, the company pivoted and developed an audio-based covid-19 screening test. In 2022, the company announced that the tool correctly identified 92% of positive covid cases just from the sound of a patient’s cough. Soon after, Pfizer paid $179 million to acquire ResApp.

Bensoussan is skeptical that these kinds of apps will become reliable diagnostics. But she says apps that detect coughs—any coughs—could prove to be valuable health tools even if they can’t pinpoint the cause. Coughs are especially easy for smartphones to capture. “It’s a sea change to have a common device, the smartphone, which everyone has sitting by their bedside or in their pocket to help observe your coughs,” Jamie Rogers, product manager at Google Health, told Time magazine. Google’s newest Pixel phones have cough and snore detection available.

Bensoussan also thinks cough-tracking apps could be game-changers for clinical trials where coughs are one of the things researchers are trying to measure. “It’s really hard to track cough,” she says. Researchers often rely on patients’ recall of their coughing. But an app would be far more accurate. “It’s really easy to capture the frequency of cough from a tech perspective,” she says. 

And it’s not just coughs that can reveal clues about our health status. Bensoussan is leading a $14 million project funded by the NIH to develop a massive database of voice, cough, and respiratory sounds to aid in the development of tools to diagnose cancers, respiratory illnesses, neurological and mood disorders, speech disorders, and more. The database captures a wide variety of sounds—coughing, reading sentences or vowel sounds, inhaling, exhaling, and more. 

“One of the big limitations is that a lot of these studies have private data sets that are secret,” Bensoussan says. That makes it difficult to validate the research. The database that she and her colleagues are developing will be publicly available. She expects the first data release to happen before June.

As more data becomes available, expect to see even more apps that can help alert us to health problems on the basis of cough or speech patterns. It’s too soon to say whether those apps will make a significant difference in diagnosis or screening, but we’ll keep an ear out for any new developments.

Read more from MIT Technology Review’s archive

Vocal cues could provide a way to diagnose PTSD, traumatic brain injuries, mood disorders, and even heart disease, Emily Mullin wrote in this story from 2017. 

AI tools might perform well in the lab but falter in the chaos of the real world. Will Douglas Heaven unpacked what happened when Google Health implemented a tool in Thailand to screen people for an eye condition linked to diabetes.

In a previous issue of The Checkup, Jessica Hamzelou outlined why we shouldn’t let AI make all our health-care decisions: “Doctors may be inclined to trust AI at the expense of a patient’s own lived experiences, as well as their own clinical judgment.”

From around the web

Safe bathrooms equipped with motion sensors have eliminated overdose deaths at a Boston clinic that serves unhoused individuals in the city’s infamous “methadone mile”—further proof that supervised consumption sites would save lives. (STAT)

Now that we’ve got new blockbuster weight-loss drugs, some companies are looking to develop longer-lasting treatments and preventatives. But some say an obesity-free future won’t come from pharma. “We are not going to be able to treat our way out of this problem, or medicalize our way out of this problem,” says William Dietz, director of the Global Center for Prevention and Wellness at George Washington University. “What we need to do is to come to terms with the kind of environmental forces which are driving obesity, and generate the political will necessary to address those factors.” (STAT)

Advances in neuroscience have sparked worries that brain-computer interfaces might someday read people’s minds or hamper free will. Now “neurorights” advocates are racing against the clock to push for laws that would protect against the misuse and abuse of neurotechnology. (Undark)
