
Brain Scans Teach Humans to Empathize with Bots

Mirror neurons light up when we’re put in their shoes.

When we watch a human express a powerful emotion - anger, fear, disgust - large sections of our brains light up, including the so-called “mirror neurons,” which are unique in that they fire both when we perform a given action and when we perceive it in others. They are the basis of what neuroscientists call resonance.

Resonance describes the mechanism by which the neural substrates involved in the internal representation of actions, as well as emotions and sensations, are also recruited when perceiving another individual experiencing the same action, emotion or sensation.

To test whether the same brain regions activate when a human sees a robot expressing powerful emotions as when a human watches another human express them, an international group of researchers put volunteers into an fMRI scanner - which can, within limits of spatial and temporal resolution, determine which parts of the brain are active at any given moment - and played them clips of humans and robots making identical facial expressions.

On a very basic level, the researchers were asking whether humans empathize with even obviously mechanical robots.

The results, published last week in the journal PLoS ONE, were about what you would expect: in the default scenario, in which participants were told to concentrate on the motion of the facial gesture itself, their brains showed significantly less activation when they watched robots expressing emotion than when they watched humans doing the same thing.

But a funny thing happened when they were told to concentrate on the emotional content of the robots’ expressions: their brains showed significantly increased activity, including in the areas that contain mirror neurons.

So when we are asked to think about what a robot expressing an emotion might be feeling, we are instantly more likely to empathize with it. The very instruction - please concentrate on what the robot is feeling - presupposes that the robot has emotions in the first place.

Whether or not the robot is actually feeling something is therefore up to us - it depends on what we believe about the robot’s sentience. It’s not hard to convincingly simulate at least an animal-like level of emotion in robots with even the most primitive gestural vocabulary - that’s the basis of the success of robotic therapy as carried out with, for example, the robotic baby seal Paro.

Below, I’ve embedded the very video that participants were shown while in the fMRI scanner. The robot itself is barely recognizable as human, and its gestures even less so, which makes it all the more intriguing that participants were able to imagine, just for an instant, that it had feelings, too. What might an even more humanoid - or more familiar - robot accomplish?
