A Room Where Executives Go to Get Help from IBM’s Watson
Researchers at IBM are testing a version of Watson designed to listen and contribute to business meetings.
Software able to understand human interactions could help with business decisions.
Photocopiers, PCs, and video conferencing rooms all rose from being technological novelties to standard tools of corporate life. Researchers at IBM are experimenting with a candidate for the next one: a room where executives can go to talk over business problems with a version of Watson, the computer system that defeated two Jeopardy! champions on TV in 2011.
An early prototype has been built in the Cognitive Environments Lab, which opened last year at IBM's Thomas J. Watson research center in Yorktown Heights, New York. It is intended to explore how software that can understand and participate in human interactions could "magnify human cognition," says Dario Gil, director for symbiotic cognitive systems at IBM Research.
The lab looks more or less like a normal meeting space, but with a giant display taking up one wall and an array of microphones installed in the ceiling. Everything said in the room can be instantly transcribed, providing a detailed record of any meeting and allowing the system to listen for commands addressed to "Watson."
Those commands can be simple requests for information of the kind you might type into a search box. But Watson can also take a more active role in a discussion. In a live demonstration, it helped researchers role-playing as executives to generate a short list of companies to acquire.
First, Watson was brought up to speed by being directed, verbally, to read over an internal memo summarizing the company’s strategy for artificial intelligence. It was then asked by one of the researchers to use that knowledge to generate a long list of candidate companies. “Watson, show me companies between $15 million and $60 million in revenue relevant to that strategy,” he said.
After the humans in the room talked over the results Watson displayed on screen, they called out a shorter list for Watson to put in a table with columns for key characteristics. After mulling some more, one of them said: "Watson, make a suggestion." The system ran a set of decision-making algorithms and bluntly delivered its verdict: "I recommend eliminating Kawasaki Robotics." When Watson was asked to explain, it simply added: "It is inferior to Cognilytics in every way."
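IBM has not said which decision-making algorithms Watson ran, but a verdict like "inferior in every way" is the language of Pareto dominance: one option is eliminated when another is at least as good on every criterion and strictly better on at least one. A minimal sketch of that idea, using hypothetical company names, criteria, and scores that are purely illustrative:

```python
# Hypothetical acquisition candidates scored on criteria where higher is better.
# Company names, criteria, and scores are illustrative, not from the demo.
candidates = {
    "Cognilytics":       {"revenue_fit": 8, "strategy_fit": 9, "talent": 7},
    "Kawasaki Robotics": {"revenue_fit": 5, "strategy_fit": 6, "talent": 4},
    "AcmeAI":            {"revenue_fit": 9, "strategy_fit": 5, "talent": 8},
}

def dominates(a, b):
    """True if option a is at least as good as b on every criterion
    and strictly better on at least one ("inferior in every way")."""
    return (all(a[k] >= b[k] for k in a)
            and any(a[k] > b[k] for k in a))

def non_dominated(cands):
    """Return the names of candidates no other candidate dominates."""
    return [name for name, scores in cands.items()
            if not any(dominates(other, scores)
                       for other_name, other in cands.items()
                       if other_name != name)]

print(non_dominated(candidates))
# Kawasaki Robotics is dominated by Cognilytics, so it is eliminated;
# the other two each win on at least one criterion, so both survive.
```

Note that dominance only removes clearly worse options; choosing among the survivors (here, a company strong on strategy fit versus one strong on revenue fit) still requires weighing the criteria, which is presumably where the humans in the room come back in.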
IBM’s researchers are also considering other ways the technology at work in their current demo might help out in a workplace—for example, by having software log the relative contributions of different people to a discussion, or deliver a kind of fact-checking report after a meeting that highlights mistaken assertions.
By surfacing that kind of information, Watson could change the dynamics of group interactions for the better, says Gil. “Watson could enhance collective intelligence by facilitating turn taking, or having a neutral presence that can help prevent groupthink,” he says. For example, people may feel freer to question their boss’s opinion if Watson is the first to suggest there is another way of looking at a problem.
IBM is not the first to try to improve meetings by having software understand and enhance them. One large project backed by the European Union developed technology that records and summarizes meetings using a combination of speech recognition and sensors that tracked participants’ head movements and gaze for signals of the most useful content.
“Using recognition and content analysis technologies has a significant potential to enhance both face-to-face and remote meetings, and could significantly improve organizational cultures,” says Steve Renals, a professor of speech technology at the University of Edinburgh who helped lead that project.
However, the accuracy of speech transcription remains a challenge to the reliability of such technology, says Renals. Even a person speaking directly into a microphone in a quiet room is unlikely to have all their words transcribed correctly, and meetings come with extra problems such as people talking over one another, echoes, and incidental noises such as tapping pens.
In the demonstration shown to MIT Technology Review, the IBM participants wore microphones to give Watson a clearer signal. But Gil's team is also working on a microphone array able to collect sound along multiple tightly focused, steerable directions. It would use information from cameras in the ceiling to lock onto people and get a clear recording of their speech.
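IBM has not published how that array works, but steering microphones toward a speaker whose position is known from cameras is classically done with delay-and-sum beamforming: delay each microphone's signal so that sound arriving from the target position lines up across channels, then average, which reinforces the speaker and smears out noise from other directions. A minimal sketch with synthetic data, where the mic layout, sample rate, and source position are all assumptions for illustration:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air
FS = 16_000             # assumed sample rate, Hz

def delay_and_sum(signals, mic_positions, source_pos, fs=FS):
    """Steer an array at a known source position.

    signals:       (n_mics, n_samples) array of recorded channels
    mic_positions: (n_mics, dims) mic coordinates in meters
    source_pos:    (dims,) target coordinates in meters

    Delays each channel so arrivals from source_pos align with the
    farthest mic, then averages the aligned channels.
    """
    dists = np.linalg.norm(mic_positions - source_pos, axis=1)
    delays = (dists.max() - dists) / SPEED_OF_SOUND   # seconds per channel
    shifts = np.round(delays * fs).astype(int)        # samples per channel
    aligned = np.zeros_like(signals, dtype=float)
    for i, s in enumerate(shifts):
        # Shifting a channel later in time delays its (earlier) arrival.
        aligned[i, s:] = signals[i, : signals.shape[1] - s]
    return aligned.mean(axis=0)

# Synthetic check: a unit click from the source reaches two mics at
# slightly different times; after alignment the clicks coincide.
mics = np.array([[0.0, 0.0], [0.5, 0.0]])
src = np.array([0.0, 2.0])
sig = np.zeros((2, 1024))
for i, a in enumerate(np.round(
        np.linalg.norm(mics - src, axis=1) / SPEED_OF_SOUND * FS).astype(int)):
    sig[i, a] = 1.0
out = delay_and_sum(sig, mics, src)
```

Making the beams steerable, as Gil's team describes, would just mean recomputing the delays whenever the cameras report a new speaker position, and running one beamformer per tracked speaker for multiple simultaneous beams.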