Making Robots Give the Right Glances
By mimicking nonverbal actions, robots could become better assistants.
If robots are to become a common sight in homes and public spaces, they will need to respond more intuitively to human actions and behave in ways that are easier for humans to understand. This week, at the 2009 IEEE Human-Robot Interaction (HRI) conference, in La Jolla, CA, researchers will present recent progress toward these twin goals.
Several research teams are exploring ways for robots to both recognize and mimic the subtle, nonverbal side of human communication: eye movements, physical contact, and gestures. Mastering these social subtleties could help machines convey meanings to supplement speech and better respond to human needs and commands. This could be crucial if robots are ever to fulfill their potential as personal assistants, teaching aides, and health-care helpers, say those involved.
Scientists from Carnegie Mellon University will present details of experiments involving a robot that uses eye movement to help guide the flow of a conversation with more than one person. Developed in collaboration with researchers from Japan’s Osaka University and from ATR Intelligent Robotics and Communication Laboratory, this trick could prove particularly useful for robots that act as receptionists in buildings or malls, or as guides for museums or parks, the scientists say.
“The goal is [to] use human communication mechanisms in robots so that humans interpret behaviors correctly and respond to them in an appropriate way,” says Bilge Mutlu, a member of the team from Carnegie Mellon. After all, Mutlu notes, “we don’t want to create an antisocial, shy robot.”
The robot used for the experiments, called Robovie, was developed previously at ATR. To give Robovie the ability to combine gaze with speech, the researchers first developed a model of the way that people use their eyes during a conversation or a discussion. They studied the social-cognition literature to develop predictive models, and then refined these models by collecting data from laboratory observations. Finally, the group incorporated this data into the software that controls Robovie in different conversational settings.
During the experiments, Robovie played the role of a travel agent, greeting participants, introducing itself, and then asking a series of questions to determine where the participants would like to travel. Three conversational scenarios were tested: addressing one participant while ignoring the other; addressing one participant while acknowledging the other as a bystander with quick glances; and addressing both participants equally, with equal amounts of eye contact.
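The three scenarios amount to giving each participant a different share of the robot's gaze time. A minimal sketch of that idea — purely illustrative, not the actual Robovie controller, with the share values and names (`GAZE_SHARES`, `pick_gaze_target`) invented for this example — might weight a random choice of gaze target by scenario:

```python
import random

# Hypothetical gaze-time shares per scenario (illustrative values,
# not from the Carnegie Mellon/ATR study).
GAZE_SHARES = {
    "ignore":    {"addressee": 1.0, "other": 0.0},  # address one, ignore the other
    "bystander": {"addressee": 0.8, "other": 0.2},  # quick acknowledging glances
    "equal":     {"addressee": 0.5, "other": 0.5},  # equal eye contact for both
}

def pick_gaze_target(scenario, rng=random):
    """Choose whom the robot looks at next, weighted by the scenario's shares."""
    shares = GAZE_SHARES[scenario]
    targets = list(shares.keys())
    weights = list(shares.values())
    return rng.choices(targets, weights=weights, k=1)[0]

# In the "ignore" scenario the bystander never receives gaze;
# in the other scenarios gaze is split according to the weights.
```

Repeatedly calling `pick_gaze_target` inside the robot's speech loop would yield the distinct gaze patterns the study compared; the real system also had to time glances relative to the flow of speech.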
The team found that Robovie was able to guide the flow of a conversation effectively. Those at whom the robot gazed for longer took more turns speaking, those to whom Robovie sent acknowledging glances spoke less, and those who were ignored completely spoke the least. This pattern was consistent about 97 percent of the time. The researchers say that future work will combine the robot’s gaze with other nonverbal cues, including gestures.
Another team at the conference is focusing on simple physical contact. Using a small, remote-controlled humanoid robot, scientists from the Netherlands conducted an experiment in which they showed volunteers the robot attempting to assist a person using a computer. The volunteers described the robot as less machine-like and more dependable when it proactively offered help and made physical contact, such as a shoulder pat or a high five. “We showed that how behaviors such as proactiveness and touch are combined matters,” says Henriette Cramer, a researcher and PhD candidate at the University of Amsterdam, who will present the findings tomorrow. She says that the goal of her team’s research is to find out when, and what kind of, physical contact works. “We think touch is an important aspect of interaction and we want to further explore its effects, especially in combination with other social behaviors,” she adds.
“We’re really looking at building into these robots very humanlike social abilities,” says Brian Scassellati, a professor who studies human-robot interaction at Yale University and who is the program co-chair for HRI 2009. The field of human-robot interaction is young but growing rapidly, says Scassellati, and it is revealing much about human social psychology. “It’s only really in the last 10 years or so that we’ve had the computational and perceptual capability on these machines to really make a difference,” he notes.