Tech policy

Emotional Chatting Machine Assesses Your Emotion and Copies It

Chatbots have never been able to empathize. That looks set to change, thanks to a Chinese team that has built a chatbot capable of conveying specific emotions.

Chatbots have a long and venerable history dating back to the 1960s and the famous Eliza bot that fooled some people into thinking they were chatting with a real human. Since then, computer programs capable of conducting conversations have become progressively more advanced with greater ability to understand content and respond appropriately.

And yet the ability to reproduce human speech convincingly still eludes chatbots. Talk to one for any length of time and it soon becomes clear that it is a machine.

One reason is that computers are unable to gauge the emotional content of conversations and empathize accordingly. This lack of emotional intelligence inevitably gives chatbots away.

Today, that looks set to change, thanks to the work of Hao Zhou at Tsinghua University in Beijing and a few pals who have developed a chatbot capable of assessing the emotional content of a conversation and responding accordingly.

The work opens the door to a new generation of chatbots that are emotionally aware. “To the best of our knowledge, this is the first work addressing the emotion factor in large-scale conversation generation,” say Zhou and co.

Psychologists generally classify emotion into six basic categories: happiness, sadness, disgust, anger, surprise, and fear. In written language, we convey these emotions using words that carry a specific emotional valence. The way this valence changes throughout a conversation is a measure of its emotional content.

For example, words such as laugh and smile are associated with happiness, depressing and cry are associated with sadness, and so on. Indeed, psychologists have created huge tables listing the emotional valence of words.

It is this kind of database that certain apps use to determine whether tweets are positive or negative—a technique known as sentiment analysis. In fact, using a sliding-window technique, social scientists can study the way the emotional charge changes throughout entire novels.
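As a rough illustration of the idea (not any particular app's implementation), word-level valence scoring with a sliding window can be sketched in a few lines. The tiny lexicon and its scores below are invented for the example; real systems use large psycholinguistic tables covering thousands of words:

```python
# Toy sentiment scoring with a hand-made valence lexicon (illustrative only).
VALENCE = {
    "laugh": 1.0, "smile": 1.0, "happy": 0.8,
    "depressing": -1.0, "cry": -0.9, "sad": -0.8,
}

def sentence_score(sentence):
    """Average valence of the lexicon words appearing in a sentence."""
    words = sentence.lower().split()
    scores = [VALENCE[w] for w in words if w in VALENCE]
    return sum(scores) / len(scores) if scores else 0.0

def sliding_window_scores(sentences, window=2):
    """Mean valence over a sliding window of sentences, tracing how
    the emotional charge changes as a text progresses."""
    per_sentence = [sentence_score(s) for s in sentences]
    return [
        sum(per_sentence[i:i + window]) / window
        for i in range(len(per_sentence) - window + 1)
    ]

text = ["I smile and laugh", "then things turn depressing", "I cry"]
print(sliding_window_scores(text, window=2))
```

Sliding the window across a whole novel, rather than three sentences, yields the kind of emotional-arc curves the social scientists mentioned above study.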

Now Zhou and his team have used the same kind of technique to analyze and control the emotional content of conversations. This task consists of two parts. The first is to analyze the emotional content of the human user’s conversation using techniques similar to sentiment analysis.

But the second part is trickier. It involves generating responses that are both relevant and emotionally appropriate.

Zhou and colleagues’ method is relatively straightforward. They start with a dataset of 23,000 sentences collected from the Chinese blogging service Weibo, each manually annotated with its emotional charge: anger, disgust, happiness, sadness, or “like,” an additional category associated with liking something. (They ignore surprise and fear, which are relatively rare.)

The team employs this dataset to train a deep-learning algorithm to classify sentences according to their emotional valence.

Finally, they use an ordinary chatbot conversation generator to produce responses, using the deep-learning algorithm to ensure that each response has the correct emotional content. They call their system the Emotional Chatting Machine.
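The actual system trains a deep neural network on the annotated Weibo corpus; as a stand-in for the classification step, here is a minimal bag-of-words classifier with add-one smoothing over a tiny invented training set (both the data and the method are simplifications for illustration, not the paper's model):

```python
from collections import Counter, defaultdict

# Invented toy training data standing in for the annotated Weibo corpus;
# the real system trains a deep-learning classifier on ~23,000 sentences.
TRAIN = [
    ("keep smiling things will get better", "happiness"),
    ("what a happy wonderful day", "happiness"),
    ("it is so depressing i could cry", "sadness"),
    ("i feel sad and gloomy", "sadness"),
    ("the traffic is terrible i am furious", "anger"),
    ("this delay makes me so angry", "anger"),
]

def train(examples):
    """Count word frequencies per emotion label (bag of words)."""
    counts = defaultdict(Counter)
    for text, label in examples:
        counts[label].update(text.split())
    return counts

def classify(counts, sentence):
    """Score each emotion by word overlap, with add-one smoothing
    so unseen words don't zero out a label's score."""
    words = sentence.lower().split()
    def score(label):
        total = sum(counts[label].values())
        s = 1.0
        for w in words:
            s *= (counts[label][w] + 1) / (total + 1)
        return s
    return max(counts, key=score)

model = train(TRAIN)
print(classify(model, "I am so angry about the traffic"))
```

The deep-learning classifier in the paper plays the same role as `classify` here, only with far richer features learned from the data rather than raw word counts.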

For example, to reply to the statement “Worst day ever. I arrived late because of the traffic,” the Emotional Chatting Machine can generate different responses, depending on the required emotion.  

For happiness, it responds, “Keep smiling! Things will get better.” For sadness, it responds, “It’s depressing.” For disgust, it says, “Sometimes life just sucks.” For anger, it says, “The traffic is too bad!” And to express liking, it says, “I am always here to support you.”
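The Emotional Chatting Machine conditions a neural sequence-to-sequence generator on a target emotion category. As a purely illustrative sketch of that interface (not the paper's neural model), a lookup keyed by emotion reproduces the example above; the canned replies are taken from the article's example:

```python
# Illustrative only: the real system generates text with a neural decoder
# conditioned on the target emotion; this sketch just keys canned replies
# (from the article's example) by emotion category.
RESPONSES = {
    "happiness": "Keep smiling! Things will get better.",
    "sadness": "It's depressing.",
    "disgust": "Sometimes life just sucks.",
    "anger": "The traffic is too bad!",
    "like": "I am always here to support you.",
}

def respond(user_utterance, target_emotion):
    """Return a reply conditioned on the requested emotion category."""
    return RESPONSES.get(target_emotion, "I see.")

print(respond("Worst day ever. I arrived late because of the traffic.", "anger"))
```

The key design point is that the emotion is an explicit input to generation, so the same user utterance can yield five different replies.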

This is interesting work that could have significant applications. The ability to empathize (or seem to empathize) is an important component of human communication. Various studies have shown that humans are much more likely to react positively to an empathetic conversation. And that would certainly be useful in many call center–type situations.

Ref: arxiv.org/abs/1704.01074: Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory
