Disrupting brain activity with magnets can alter moral judgments
To make moral judgments about other people, we often need to infer their intentions, an ability known as "theory of mind." For example, if someone shoots a companion on a hunting trip, we need to understand what the shooter was thinking before we condemn him. Was he secretly jealous, or did he mistake his fellow hunter for an animal?
MIT neuroscientists have now shown that they can influence those judgments by interfering with activity in a specific brain region, a finding that helps reveal how the brain constructs morality. In the new study, published in Proceedings of the National Academy of Sciences, the researchers used a magnetic field to disrupt activity in the right temporoparietal junction (TPJ). The stimulation appeared to influence subsequent judgments that required an understanding of other people's intentions.
The findings offer “striking evidence” that the right TPJ, located at the brain’s surface above and behind the right ear, is critical for making moral judgments, says Liane Young, the paper’s lead author and a postdoctoral associate in brain and cognitive sciences. It’s also startling, since normally people are very confident and consistent in such judgments, she adds. “You think of morality as being a really high-level behavior,” Young says. “To be able to apply [a magnetic field] to a specific brain region and change people’s moral judgments is really astonishing.”
Study author Rebecca Saxe, PhD ‘03, an assistant professor of brain and cognitive sciences, first identified the right TPJ’s role in theory of mind a decade ago as a doctoral student. Since then, she has used functional magnetic resonance imaging (fMRI) to show that the right TPJ is active when people are asked to make judgments that require thinking about other people’s intentions.
In this study, the researchers used noninvasive transcranial magnetic stimulation (TMS) to interfere with brain activity in the right TPJ. A magnetic field applied to a small area of the skull creates weak electric currents that temporarily impede nearby brain cells’ ability to fire normally.
Volunteers exposed to TMS were asked to judge, for example, how acceptable it is for a man to let his girlfriend walk across a bridge he believes to be unsafe without warning her, even if she ends up crossing safely. When the right TPJ was disrupted, subjects were more likely to judge such scenarios as morally permissible. The researchers believe that TMS interfered with subjects' ability to interpret others' intentions, forcing them to rely more on outcomes to make their judgments. "It doesn't completely reverse people's moral judgments," Saxe says. "It just biases them."