You probably think you’re a pretty nice person when it comes to making comments online, but new research indicates anyone can act like a troll under the right circumstances—even you.
Online harassment has been a problem for years, and it has only gotten worse, spreading from social networks like Twitter to the furthest reaches of the Internet.
So who, exactly, are these trolls? According to an experiment conducted by Stanford and Cornell researchers, it could be any one of us: they determined that being in a bad mood and seeing troll-like posts from other people on online articles made it more likely that people would then type nasty comments themselves. A paper on the work will be presented at a conference in Portland, Oregon, in late February.
The researchers say their work is meant to challenge the idea that the people spouting all this negativity are all antisocial, sitting in dark rooms writing comments on discussion forums and social networks. They also think it may be used to help predict when trolling is likely to occur.
To conduct their work, researchers first had to set the mood, so to speak. They did this by giving participants a quiz to complete in a set amount of time; some people received a trickier quiz in hopes of irritating them, researchers said, and others an easier one in hopes of making them happier. Participants then answered a list of questions meant to quantify their mood.
After that, they took part in an online discussion where they saw an article related to the presidential election with either benign or troll-like comments tacked on to it. Researchers found the highest number of trolling posts occurred when people were in a negative mood and saw other mean comments already added to an article. Specifically, they determined that being in a bad mood raised the chances that someone would troll by 89 percent, and that seeing other people’s invective increased the chances by 68 percent.
The researchers also analyzed 16 million comments on CNN’s website: a quarter of the posts flagged as abusive were written by people who hadn’t done that kind of thing in the past, and once a negative post appeared on an article, more negative posts tended to follow. They also found that the most negative behavior occurred in the evenings, and on Mondays—both times when research has already indicated that people’s moods may be worse.
But Kathryn Seigfried-Spellar, an assistant professor at Purdue University who studies trolling and online bullying, points out that just because someone makes a negative comment doesn’t always mean they’re trolling.
“I might actually feel that way. My profanity is an expression of an attitude that I hold, rather than it being something I’m trying to divert or upset you with,” she says.