Watson Goes to the Hospital
Can IBM’s Jeopardy winner help doctors treat their patients?
Last week, IBM’s Watson computer beat two human competitors on Jeopardy. Before the contest was even over, IBM and Nuance, a leading maker of voice-recognition software, announced plans to put Watson to work in the health-care industry.
The idea is for Watson to digest huge quantities of medical information and deliver useful real-time information to physicians, perhaps eventually in response to spoken questions. If successful, the system could help medical experts diagnose conditions or create a treatment plan. But it could prove a far more challenging trick than winning a game show.
“The medical domain doubles in knowledge every few years,” said Janet Dillione, executive vice president and general manager of the health-care division of Nuance. “No human brain can possibly retain all the information that’s out there.”
Dillione says that while other health-care technology can work with huge pools of data, Watson is the first system capable of usefully harnessing the vast amounts of medical information that exist in the form of natural language text—medical papers, records, and notes. Nuance hopes to roll out the first commercial system based on Watson technology within two years, although it has not said how sophisticated this system will be.
Watson holds 200 million pages of unstructured data, including some medical information. But the first part of a new IBM-Nuance research project, which is taking place at the University of Maryland and Columbia University, will be determining what other information Watson needs to know. Even then, it will be tricky to present that information in the right format. For the Jeopardy challenge, Watson was fed precategorized and tagged data. The medical literature, in contrast, consists of terabytes of highly specialized and unstructured data.
“Clinical text is often ungrammatical, rich in ambiguous acronyms and abbreviations, misspellings, and sometimes written to resemble bullet lists or tables, especially when directly typed in by health-care providers,” says Stephane Meystre, an assistant professor of biomedical informatics at the University of Utah.
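Meystre's point about ambiguous acronyms can be seen in a toy example: the same short abbreviation in a clinical note can expand several different ways depending on context. The dictionary and note below are invented purely for illustration; a real system would use far larger vocabularies and surrounding context to pick the right expansion.

```python
# Illustrative only: a tiny, made-up abbreviation dictionary showing why
# clinical shorthand is hard to interpret automatically.
ABBREVIATIONS = {
    "pt": ["patient", "physical therapy", "prothrombin time"],
    "ra": ["rheumatoid arthritis", "right atrium", "room air"],
    "ca": ["cancer", "calcium", "cardiac arrest"],
}

def candidate_expansions(token):
    """Return every known expansion for a token; unknown tokens pass through."""
    return ABBREVIATIONS.get(token.lower(), [token])

# A (fabricated) fragment of provider shorthand.
note = "pt reports ra pain, r/o ca"
for token in note.replace(",", "").split():
    options = candidate_expansions(token)
    if len(options) > 1:
        print(f"{token!r} is ambiguous: {options}")
```

Each ambiguous token yields multiple plausible readings, and choosing wrongly ("prothrombin time reports room air pain") changes the clinical meaning entirely—which is why accuracy requirements in this domain are so strict.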
Having Watson listen to the dialogue between a doctor and his or her patient would be very hard, as the dialogue is usually free-form and conversational. Meystre says the main challenge in natural language processing in a clinical setting is the need for very high accuracy and speed—Watson can handle the speed, but errors in accuracy could have serious consequences, including legal liability.
Physicians and nurses would also need to be trained to use the technology in their work. They would normally expect long, descriptive answers to medical queries, not the short, succinct ones Watson gave on Jeopardy, says Rohit Kate, a professor of informatics and computer sciences at the University of Wisconsin-Milwaukee. “Physicians and nurses may not be interested in just the answer but also some reasoning or justification behind arriving at it, otherwise they will be reluctant to use the answer by itself for something as critical as their patients’ health.” Watson would need to justify the answer and cite sources, according to Kate. Ideally, the system would even be able to clarify an answer by talking to a physician directly.
“Not all may be technologically savvy enough to be comfortable using such a system,” says Kate. “Some may have qualms about trusting a computer and a few may even feel threatened that their expertise is being replaced by a machine.” Kate predicts that it will take at least a decade before computers can converse with, and work alongside, physicians and nurses.
However, some technology experts foresee a more immediate use for Watson. Michael Swiernik, director of medical informatics at UCLA, says he could imagine the technology being used to process calls at health centers, making more useful information available to patients 24 hours a day.