
Socially Sensitive AI Software Coaches Call-Center Workers

Customer-service reps are getting real-time coaching from software that has learned to detect problems in a conversation.
January 31, 2017

Next time you call customer support, the person on the other end of the line may be getting a little help from emotionally intelligent AI software.

Some call-center workers are now receiving real-time coaching from software that analyzes their speech and the dynamics of their conversations with customers. As they talk, the software might recommend that they speak more slowly or interrupt less often, or warn that the person on the other end of the line seems upset.
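Cogito has not published its algorithms, but coaching cues of the kind described here could in principle be produced by simple rules applied to running conversation metrics. The function and thresholds below are purely hypothetical, a minimal sketch rather than the company's actual method:

```python
def coaching_cues(words_per_minute, interruptions_per_minute, caller_pitch_variance):
    """Return real-time coaching suggestions from running conversation metrics.

    All metric names and thresholds are illustrative, not Cogito's.
    """
    cues = []
    if words_per_minute > 170:          # agent is speaking quickly
        cues.append("Try speaking more slowly.")
    if interruptions_per_minute > 2:    # agent keeps cutting the caller off
        cues.append("Let the caller finish their thoughts.")
    if caller_pitch_variance > 0.8:     # unusually variable pitch can signal agitation
        cues.append("The caller may be upset; acknowledge their frustration.")
    return cues
```

In a real system these metrics would be updated continuously from the audio stream and the cues surfaced on the agent's screen mid-call.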

This gives us a fascinating glimpse of how AI and humans might increasingly work together in the future. Plenty of routine work is becoming automated in call centers and other back office settings, but real human interaction seems likely to resist automation for a long while yet. Even so, AI software may change the way people interact with customers by serving in an advisory capacity.

The call-center software is supplied by Cogito, a company based in Boston. Its software automatically assesses the dynamics of a conversation, and has been trained to recognize certain pertinent characteristics. Rather than the substance of a conversation, it analyzes the raw audio. “Conversation is like a dance,” says Josh Feast, CEO of Cogito. “You can tell whether people are in sync, and it turns out this is a much better measure than language.”
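The "dance" Feast describes can be quantified without touching the words at all. As an illustration (not Cogito's actual technique), one crude timing-based signal is how much the two parties talk over each other, which can be computed from each speaker's talk intervals:

```python
def overlap_seconds(segments_a, segments_b):
    """Total time both parties speak at once, given each speaker's talk
    intervals as (start, end) pairs in seconds.

    Simultaneous speech is a rough proxy for interruptions and for how
    out of sync a conversation is; the segmentation itself would come
    from a voice-activity detector running on the raw audio.
    """
    total = 0.0
    for a_start, a_end in segments_a:
        for b_start, b_end in segments_b:
            # Length of the intersection of the two intervals, if any.
            total += max(0.0, min(a_end, b_end) - max(a_start, b_start))
    return total

# Example: the agent speaks over the caller twice, one second each time.
agent = [(0.0, 5.0), (10.0, 15.0)]
caller = [(4.0, 11.0)]
overlap = overlap_seconds(agent, caller)  # → 2.0
```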

Call centers have long sought to analyze customers’ voices for signs of agitation or frustration. Software is getting much better at analyzing social interactions and emotions thanks to clever new machine-learning techniques and copious amounts of training data.

Feast says his company has found that call centers do not want to replace phone workers, but they are keen to improve the way they operate. “Humans are social beings,” he says. “We engage with each other for emotional reasons, and we want somebody to help us, to counsel us.”

Feast founded Cogito in 2007 with Sandy Pentland, a professor at the MIT Media Lab who specializes in studying human dynamics. The company originally developed its technology with funding from DARPA as a way to detect a person’s mental state from his or her speech.

Some companies say Cogito has helped improve the performance of their call center staff. The health-care company Humana developed a tool for call center staff using Cogito’s technology and saw a 28 percent improvement in a commonly used measure of customer satisfaction. Feast says employees working with the software typically report greater job satisfaction, too.

Following such success in customer service, Cogito is working on a platform that could see the technology deployed much more widely. Feast says it could be built into videoconferencing software, or used as an aid during business negotiations. He speculates that it might even help in marriage counseling.

“Providing feedback to individuals on the mental states that they might be inducing in other people seems valuable,” says Peter Robinson, a professor at the University of Cambridge, U.K., who studies human-computer interaction. “There are many other applications for this sort of technology.”

But Robinson says it will be important not to rely on such a system too much. “These social signals are at best ambiguous and at worst distinct for different people,” he says.  

Rosalind Picard, a professor at the MIT Media Lab who has pioneered emotion tracking (see “Thinking About Emotional Machines”), agrees that it can be problematic to develop such technology. “I think it depends how the interface is built,” Picard says. She notes, for instance, that different people often have different conversational styles. “Many New Yorkers practice a ‘high interruption’ style,” Picard says. “Interrupting can thus be likeable and build rapport with them. But the same behavior with some other callers could be seen as rude.”

Illustration by Rose Wong
