Intelligent Machines

Customer Service Chatbots Are About to Become Frighteningly Realistic

A startup gives chatbots and virtual assistants realistic facial expressions and the ability to read yours.

Soul Machines made this chatbot for the Australian government to help people get information about disability services.

Would your banking experience be more satisfying if you could gaze into the eyes of the bank’s customer service chatbot and know it sees you frowning at your overdraft fees? Professor and entrepreneur Mark Sagar thinks so.

Sagar won two Academy Awards for novel digital animation techniques for faces used in movies including Avatar and King Kong. He’s now an associate professor at the University of Auckland, in New Zealand, and CEO of a startup called Soul Machines, which is developing expressive digital faces for customer service chatbots.

He says expressive faces will make chatbots more useful and powerful, in the same way that meeting someone in person allows for richer communication than chatting via text. “It’s much easier to interact with a complex system in a face-to-face conversation,” says Sagar.

Soul Machines has already created an assistant avatar called Nadia for the Australian government. Voiced by actor Cate Blanchett and powered by IBM’s Watson software, it helps people get information about government services for people with disabilities. IBM has prototyped another avatar, Rachel, that helps with banking.

Soul Machines’s simulated faces are supposed to make chatbots more relatable.

The movements of Soul Machines’s digital faces are produced by simulating the anatomy and mechanics of muscles and other tissues of the human face. The avatars can read the facial expressions of a person talking to them, using a device’s front-facing camera. Sagar says people talking to something that looks human are more likely to be open about their thoughts and be expressive with their own face, allowing a company to pick up information about what vexes or confuses customers.

The company’s avatars can also be programmed to react to a person’s facial expressions with their own simulated facial movements, in an attempt to create the illusion of empathy.


Other companies have tried detecting people’s emotions by analyzing a person’s voice, words, or expressions. Amazon is exploring the idea as a way to improve its Alexa voice-operated assistant.

Simulating the tissues and muscles of the face allows for a wide variety of realistic facial expressions.
