Researchers at Northeastern University have developed a virtual nurse and exercise coach that are surprisingly likable and effective—even if they’re not quite as affable as the medical hologram on Star Trek. In fact, patients who interacted with a virtual nurse named Elizabeth said they preferred the computer simulation to an actual doctor or nurse because they didn’t feel rushed or talked down to.
A recent clinical trial of the technology found that Elizabeth also appears to have a beneficial effect on care. A month after discharge, people who interacted with the virtual nurse were more likely to know their diagnosis and to make a follow-up appointment with their primary-care doctor. The results of the study are currently under review for publication.
“We try to present something that is not just an information exchange but is a social exchange,” says Timothy Bickmore, associate professor in Northeastern’s College of Computer and Information Science. Bickmore led the research. “It expresses empathy if the patient is having problems, and patients seem to resonate with that.”
Bickmore first became interested in working on “virtual agents” after seeing demonstrations of very early interactive animated characters. “I was amazed at how people were instantly mesmerized by them, and how quickly this effect vanished when the characters did something stupid,” he says. “I was interested in seeing how they could be engineered to maintain the enchantment over long periods of time and be used for practical purposes beyond entertainment.”
He adds that patients with little or no computer experience seem to prefer the virtual person to more standard computer interactions, because it feels more natural.
“Most people get frightened when they hear they are going to get care from a computer, so to hear so clearly that we are not short-changing patients is gratifying,” says Joseph Kvedar, a physician and founder and director of the Center for Connected Health at Partners Healthcare. Kvedar has collaborated with Bickmore in the past.
To develop the computer-controlled avatars, researchers first recorded interactions between patients and nurses. They then tried to emulate the nurses’ nonverbal communication by endowing the virtual character with hand gestures and facial expressions. (The resulting animation is, however, much simpler than today’s sophisticated video games.)
Researchers also add small talk, asking users about local sports teams and the weather, which real nurses and coaches often do to put patients at ease. The verbal interactions are fairly basic; the nurse or trainer has a set repertoire of questions, and users choose from a selection of possible answers. For anything beyond that repertoire, the virtual agent will refer the patient to a human health-care provider.
Adding these seemingly simple touches of humanity does appear to influence how people interact with the program. Patients reported their health information more accurately when interacting with the virtual character than when filling out a standard electronic questionnaire.
“This was designed from the ground up to be patient-friendly, warm and engaging; it’s not necessarily the most lifelike and real-human-looking representation, but through trial and error, they have found the characteristics that resonate with patients,” says Steven Simon, chief of general internal medicine at the VA Boston Healthcare System. “I think they are just scratching the surface in terms of how it can best be used, such as in patients with chronic conditions, such as asthma and diabetes.”
Such technologies will become increasingly important with rising health-care costs and an aging population. “We already know we don’t have enough health-care providers to go around, and it’s only getting worse,” says Kvedar. “About 60 percent of the cost of delivering health care comes from human resources, so even if you can train more people, it’s not an ideal way to improve costs.”
Kvedar worked with Bickmore on a second, home-based trial, in which a virtual coach called Karen encouraged overweight sedentary adults to exercise. Users checked in with Karen three times a week, and she gave them recommendations and listened to their problems. Over 12 weeks, those who talked to the coach were significantly more active than those who simply had an accelerometer to record how much they walked.
“Older adults seem to be really accepting. They like the social aspect of it,” says Bickmore. “With the home-based agent, I think they would like to chat with them longer than we let them.”
Some users wanted to know more about their virtual coaches, so Bickmore’s team experimented with giving the characters a backstory. They found that participants whose virtual coach told them stories in the first person were more likely to log into the system than those who heard the same stories in the third person.
“They had more frequent conversations with the coach when it was being more human, and they did not report feeling more deceived,” says Bickmore. He adds that when asked, participants do understand the character is virtual, but they say they sometimes forget. “They say they will feel guilty about not logging in, which means they have formed some kind of emotional bond.”
But not everyone responded well to Karen. One of the challenges in broadening the use of this technology will be creating virtual characters that can learn from users and adapt to their preferences.
Bickmore’s team is now working on a virtual nurse that would reside in the hospital room. Patients can talk to it about their hospital experience, report pain levels, and ask questions. The researchers are also integrating sensors into the system, to record when the patient is sleeping, for example, or to track when different doctors enter the room.
In a pilot study, patients had an average of 17 conversations with the nurse per day. “When we interviewed them afterward, we found that the agent seemed to be effective at addressing the loneliness you often feel if you’re at the hospital by yourself,” says Bickmore.