
Watson Goes to Work in the Hospital

Technology like that inside the Jeopardy! champ is being used to identify when babies are acquiring an infection.
April 13, 2011

Designed to answer Jeopardy! questions, IBM’s Watson is of little use beyond the game show’s set. But some of the techniques that helped the computer defeat two human Jeopardy! champions in February are showing promise in a new context: the hospital. Researchers in Canada are using analytics like those that helped the computer decipher the language of clues to provide an early warning when babies in an intensive care unit acquire a hospital-borne infection.

Data deluge: Streams of medical data from babies in the ICU can provide early warning of an infection.

As you would expect, babies in an ICU are surrounded by equipment that tracks their vital signs, but much of that data is wasted, says Carolyn McGregor, a researcher at the University of Ontario Institute of Technology. “They produce constant streams of data,” she says, “but that information is often distilled down to a [nurse’s] spot reading every 60 minutes, written on paper.”

McGregor leads a project that has developed software to ensure that no scrap of that data goes to waste. At the neonatal ICU of the Hospital for Sick Children in Toronto, that software, dubbed Artemis, collects data from eight infant beds. The system can monitor the baby’s electrocardiogram, heart rate, breathing rate, blood oxygen level, temperature, and blood pressure. It can also access data from medical records, such as the baby’s birth weight. McGregor and colleagues are developing algorithms that use those signals to spot signs of hospital-borne infection before doctors and nurses do.
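The shape of the per-bed data Artemis handles can be pictured with a minimal sketch. The record fields and sample values below are illustrative assumptions for this article, not the actual schema used by the system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class VitalReading:
    """One timestamped sample from a bedside monitor (illustrative schema)."""
    bed_id: int
    timestamp: datetime
    heart_rate_bpm: float
    resp_rate_bpm: float
    spo2_pct: float    # blood oxygen saturation
    temp_c: float

def ingest(readings):
    """Group a stream of readings by bed so per-baby algorithms can run on each."""
    by_bed = {}
    for r in readings:
        by_bed.setdefault(r.bed_id, []).append(r)
    return by_bed

sample = [
    VitalReading(1, datetime.now(timezone.utc), 142.0, 48.0, 96.0, 36.8),
    VitalReading(2, datetime.now(timezone.utc), 155.0, 52.0, 93.0, 37.1),
    VitalReading(1, datetime.now(timezone.utc), 140.0, 47.0, 97.0, 36.9),
]
grouped = ingest(sample)
# grouped[1] now holds both readings from bed 1
```

In practice each bed produces many such signals continuously; grouping by bed is what lets a detection algorithm consider one baby's full picture at a time.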

The practice currently used to diagnose infections in the ICU has a high false-positive rate, which means that many babies are misdiagnosed and receive drugs they don’t need, or occupy an ICU bed for longer than necessary. “Babies diagnosed with infection have, on average, a doubling of length of stay,” says McGregor. “We want to reduce that.”

The researchers have already shown that Artemis can use some of the same clinical observations that doctors use to diagnose babies. For example, the system can spot episodes of apnea (a pause in breathing), which is thought to increase in frequency when an infection sets in, says McGregor. Other research has shown that a variation in heart rate can warn of an infection 24 hours before most other symptoms occur. “We have proposed our own algorithm that uses those, and a wider range of data, to detect signs of infection,” says McGregor.
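The heart-rate cue works because variability tends to flatten as an infection takes hold. A toy version of such a detector is sketched below; the sliding-window size, the threshold, and the use of a simple standard deviation are illustrative assumptions, not the clinical algorithm McGregor's team uses:

```python
import statistics

def hrv_alert(heart_rates, window=10, threshold=2.0):
    """Flag window positions where beat-to-beat variability (std dev of
    heart rate) drops below a threshold. A sustained drop in variability
    is the kind of early-warning sign described in the research; the
    window and threshold here are made-up values for illustration."""
    alerts = []
    for i in range(len(heart_rates) - window + 1):
        chunk = heart_rates[i:i + window]
        if statistics.stdev(chunk) < threshold:
            alerts.append(i)
    return alerts

# A series whose normal swings collapse into an unnaturally flat tail.
series = [140, 155, 132, 160, 128, 150, 135, 158, 130, 152] + [144] * 10
print(hrv_alert(series))
# → [10]  (only the fully flattened final window is flagged)
```

A real detector would combine this signal with the wider range of data the article mentions, such as apnea episodes and blood oxygen levels.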

Two slightly different versions of that algorithm are being trialed in the ICU. The results are due to be published later this year. The software’s effectiveness will be judged by comparing its decisions and observations with those made by medical staff. Algorithms that attempt to learn new warning signs of infection are also being tested. “No one has had access to all this data before, so we can’t always refer to past research,” says McGregor.

Artemis is built on an analytics platform called InfoSphere Streams that, like Watson, emerged from IBM research into ways that software can make decisions on the spot using data arriving at a high speed from many different sources.

“The processing paradigms we had before just didn’t fit with the kind of streaming data we are dealing with,” says McGregor. Software has traditionally performed analysis by systematically scouring a fixed, well-organized store of data, like a person navigating the stacks of a library, she explains.

InfoSphere Streams, in contrast, is based around a newer, alternative model known as stream computing. Information constantly flows into the software, where question-answering algorithms act like filters, pulling out answers from the information available at any particular moment.

That makes it possible to take on data that moves too fast to be written to hard disks, which are relatively sluggish, says Lipyeow Lim, a researcher at the University of Hawaii who previously worked at IBM’s TJ Watson laboratory. “As data comes in, you want to look at it only once, then let it go,” he says. InfoSphere Streams provides a kind of operating system for that approach, says Lim, sharing the work of implementing a particular program across many computers so the system as a whole can generate answers without committing any data to disk.
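The look-at-it-once-then-let-it-go idea can be sketched with plain Python generators. This is a conceptual illustration of stream computing, not the InfoSphere Streams API, and the 90 percent oxygen-saturation threshold is an invented example value:

```python
def vitals_stream(samples):
    """Simulate monitor samples arriving one at a time."""
    for s in samples:
        yield s

def running_alerts(stream, low_spo2=90.0):
    """Examine each sample exactly once as it flows past -- nothing is
    buffered to disk -- and emit an alert the moment a threshold is
    crossed. The 90% SpO2 cutoff is illustrative, not clinical."""
    for sample in stream:
        if sample["spo2"] < low_spo2:
            yield ("low_oxygen", sample["bed"], sample["spo2"])

incoming = [
    {"bed": 1, "spo2": 97.0},
    {"bed": 2, "spo2": 88.5},   # should trigger an alert
    {"bed": 1, "spo2": 96.2},
]
print(list(running_alerts(vitals_stream(incoming))))
# → [('low_oxygen', 2, 88.5)]
```

Because each sample is consumed and discarded as it arrives, memory use stays constant no matter how long the stream runs; a platform like InfoSphere Streams adds the machinery to spread such filters across a cluster.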

That enables the cluster of computers that make up Artemis to keep up with all the different data sources streaming in for different babies. “Monitoring one baby you could probably do with a traditional system and storage design,” says Lim. “The challenge comes when you want to monitor many of them.”

The same approach enabled Watson to answer questions fast enough to compete with human experts. As soon as it was provided with a new clue, many different natural-language-processing algorithms set to work in parallel. Their results streamed into an analytics engine similar to that in InfoSphere Streams, which reconciled the different answers and decided on Watson’s best response.

McGregor is taking advantage of Artemis’s capacity for large amounts of data to develop it into a kind of remote diagnosis resource that can serve neonatal ICUs around the world. “We have implemented a cloud version so that a women’s hospital in Rhode Island streams data to my lab over a secure Internet link,” she says. Two hospitals in China will connect their neonatal ICUs using this technology later this year.

Meanwhile, machines that more closely resemble the Watson that wowed Jeopardy! viewers are on their own, slower road to the hospital. IBM has begun collaborating with voice-recognition company Nuance to investigate how a Watson-like system that digests research literature, medical records, and doctors’ notes might advise clinicians.
