
Facebook’s Emotional Manipulation Study Is Just the Latest Effort to Prod Users

With its experiment to make people feel good or bad, Facebook pushed beyond earlier data-driven studies of voting, sharing, and organ-donation prompts.

Facebook’s controversial study, which explored whether the company could manipulate people’s moods by tweaking their news feeds to favor negative or positive content, produced a particularly negative emotional response of its own. But it is far from the social network’s first effort to control user behavior.

With huge amounts of data flooding in from more than a billion users, the company is uniquely positioned to study their every move and to run experiments that measure how behavior changes under different conditions (see “What Facebook Knows”). This helps Facebook persuade users to spend more time on the site, but over the past three years the company has also probed everything from voting behavior to the effect of encouraging people to register as organ donors.


The company has a data science team dedicated to running experiments, both to advance its business aims and to do social-science research on the side, often with collaborators in academia. Other academics perform research on Facebook without collaborating with the company, either by simply observing users or by creating apps that ask them to take part in a project.

The recent study, done in January 2012 but published only recently, hit a nerve partly because it had a negative effect on some users, but also because the affected users were not asked for permission to participate (agreeing to Facebook’s terms and conditions was taken as consent).

“What’s different about this study is that participants did not explicitly consent to being part of an experimental manipulation for the study, and that the results were published,” says Lorrie Cranor, a computer scientist at Carnegie Mellon University, where she directs the CyLab Usable Privacy and Security Laboratory.

Facebook ran an experiment on 689,003 users to see if it could manipulate their emotions by varying the selection of posts in their news feeds. One experimental group had stories containing positive words filtered out; another had stories containing negative words filtered out. On average, users subjected to these changes tended to write posts that echoed the emotional tone of their altered feeds, though the effect was small.
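The published paper describes the filtering as probabilistic: posts containing words from a positive or negative lexicon (the study reportedly used the LIWC word lists) had some chance of being omitted from a user’s feed. A minimal sketch of that idea, with made-up word lists, an invented omission rate, and hypothetical function names standing in for the study’s actual parameters, might look like this:

```python
import random

# Hypothetical word lists standing in for the LIWC lexicons the study
# reportedly used; the real lexicons contain thousands of terms.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible"}

def contains_any(post, words):
    """Return True if the post contains any word from the given lexicon."""
    tokens = post.lower().split()
    return any(token.strip(".,!?") in words for token in tokens)

def filter_feed(posts, suppressed_words, omission_rate=0.5):
    """Probabilistically drop posts that contain words from the suppressed
    lexicon, leaving the rest of the feed intact. The omission rate here
    is illustrative, not the study's actual value."""
    kept = []
    for post in posts:
        if contains_any(post, suppressed_words) and random.random() < omission_rate:
            continue  # omit this post from the rendered feed
        kept.append(post)
    return kept

# Example: a feed for a user in a "reduced negativity" condition.
feed = [
    "What a wonderful morning!",
    "I hate waiting in line.",
    "Lunch with friends today.",
]
print(filter_feed(feed, NEGATIVE_WORDS, omission_rate=0.9))
```

In a sketch like this, nothing is deleted from the underlying data; the filter only changes which posts a given user sees, which is consistent with how the experiment was described.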

But small effects can add up. Past Facebook studies have shown that relatively minor restructuring of its pages and prompts can have significant social effects. Perhaps most dramatically, a 2012 study found that get-out-the-vote reminders Facebook displayed on Election Day in 2010 prompted an estimated 340,000 more people to vote than otherwise would have (see “How Facebook Drove Voters to the Polls”).

And in 2012, Facebook showed it might have the power to get people to register as organ donors. The company put a clickable box on Timeline pages letting users indicate that they were registered donors, and the campaign coincided with a huge boost in donor enrollments. (In that case, though, extensive media coverage of Facebook’s effort complicated the analysis of whether it directly caused the increase.)

In some ways, Facebook’s published research is just part of a vast ongoing effort at Web-based manipulation. “What’s far more concerning is the lack of transparency about Facebook’s practices overall,” says Zeynep Tufekci, an assistant professor at the University of North Carolina, Chapel Hill, and a former fellow at the Center for Information Technology Policy at Princeton University. “I’m concerned about these practices—testing and manipulating the user experience every day. What else does Facebook do every day? We have no idea.”

Mining personal data is a billion-dollar business (see “The Data Made Me Do It”) designed to elicit purchases, garner eyeballs, and shape behavior. “Advertising and the media work to manipulate our emotions all the time, so I don’t find this study to be particularly problematic,” Cranor says. “We are all laboratory rats without being aware of it.”


The real issue, Cranor and others say, may be that when academic institutions are involved—researchers at Cornell University and the University of California, San Francisco, participated in the emotion study—the universities’ institutional review boards should take a closer look. Currently such review happens mainly when federal funds are involved.

Federal policy for protecting human subjects in federally funded research, known as the Common Rule, requires that subjects give their “informed consent” and that a statement of the procedures include a “description of any reasonably foreseeable risks or discomforts to the subject.” Facebook’s data use policy is far vaguer, saying only that the company might use your data for “internal operations, including troubleshooting, data analysis, testing, research, and service improvement.”

This lack of consent concerns Antonio Damasio, a neuroscientist at the University of Southern California known for key findings on the brain processes underlying emotion. “I agree that emotion manipulation is quite common, not only on the Web but in daily life,” he says. “That is what advertising in general and marketing in particular are about, but that does not authorize researchers to conduct experiments without proper consent. I can find no excuse for this behavior and no way of condoning it.”

On Monday Facebook said it had nothing to add beyond the apology its researcher, Adam Kramer, posted on the matter.
