Customer Service Chatbots Are About to Become Frighteningly Realistic

A startup gives chatbots and virtual assistants realistic facial expressions and the ability to read yours.
March 22, 2017

Would your banking experience be more satisfying if you could gaze into the eyes of the bank’s customer service chatbot and know it sees you frowning at your overdraft fees? Professor and entrepreneur Mark Sagar thinks so.

Sagar won two Academy Awards for novel digital facial animation techniques used in movies including Avatar and King Kong. He’s now an associate professor at the University of Auckland, in New Zealand, and CEO of a startup called Soul Machines, which is developing expressive digital faces for customer service chatbots.

He says that will make them more useful and powerful, in the same way that meeting someone in person allows for richer communication than chatting via text. “It’s much easier to interact with a complex system in a face-to-face conversation,” says Sagar.

Soul Machines has already created an assistant avatar called Nadia for the Australian government. Voiced by actor Cate Blanchett and powered by IBM’s Watson software, Nadia helps people get information about government services for people with disabilities. IBM has prototyped another avatar, Rachel, that helps with banking.

Soul Machines’s simulated faces are supposed to make chatbots more relatable.

The movements of Soul Machines’s digital faces are produced by simulating the anatomy and mechanics of muscles and other tissues of the human face. The avatars can read the facial expressions of a person talking to them, using a device’s front-facing camera. Sagar says people talking to something that looks human are more likely to be open about their thoughts and be expressive with their own face, allowing a company to pick up information about what vexes or confuses customers.

The company’s avatars can also be programmed to react to a person’s facial expressions with their own simulated facial movements, in an attempt to create the illusion of empathy.

Other companies have tried detecting people’s emotions by analyzing a person’s voice, words, or expressions. Amazon is exploring the idea as a way to improve its Alexa voice-operated assistant.

Simulating the tissues and muscles of the face allows for a wide variety of realistic facial expressions.

