IBM Wants Watson to Teach Robots Some Social Skills
IBM is using some of the artificial-intelligence techniques that emerged from its Watson project to teach robots to better understand and mimic human communication.
During a keynote speech at a conference called RoboBusiness held in San Jose, California, this week, Robert High, chief technology officer of Watson at IBM, demonstrated some of the techniques his team is working on using a small humanoid robot.
The robot, a Nao model from the company Aldebaran, spoke with realistic intonation and made appropriate hand gestures during a conversation with High. It even exhibited a little impatience and sarcasm, miming looking at its watch, for example, when asking High to hurry up with his talk.
Speaking with MIT Technology Review after the demo, High admitted that the interaction was prerecorded, because the system doesn’t always work well in noisy environments. But he said the capabilities demonstrated reflect real research. His team is using machine-learning algorithms trained on video footage to associate appropriate gestures and intonations with different phrases. High says this matters because language alone is only part of human communication.
“We augment the words with physical gestures to clarify this and that,” High said. “You can bring into the [robot] interface this gesturing, this body language, the eye movement, the subtle cues that we as humans use when we communicate with one another to reinforce our understanding of what we’re expressing.”
Robot interaction is becoming an important issue as industrial robots start moving into new settings, requiring them to work alongside people, and as companies try to develop robots for use in stores, offices, and even the home.
High added that the project was still at an early stage. “We’re still experimenting, seeing what’s doable and what’s useful, and what will have economic interest,” he said.
IBM used a range of artificial-intelligence techniques to develop Watson, which proved capable of winning the game show Jeopardy! after mining vast quantities of information and extracting meaning from the text (see “A Worthwhile Contest for Artificial Intelligence”).
That effort has evolved into a much broader project involving many more AI approaches under the umbrella term “cognitive computing” (see “Watson Adds Deep Learning to Its Repertoire”). IBM has made many of the machine-learning capabilities developed through this program available to developers through an online application-programming interface.
Some robot makers are already testing these APIs as a way to give their products the ability to answer queries spoken in different ways, and to look up useful information (see “A Japanese Robot Is Learning the American Way”).