In 2014, Elliot Rodger went on a shooting and stabbing spree, killing six and injuring 14 at the University of California, Santa Barbara. Rodger was a self-proclaimed “incel” (short for involuntary celibate)—a group of young men who feel furious at their perceived rejection by women and meet online to discuss and spread their ideology. Their toxic misogyny fuels a hatred for women that has led to several recent incidents of mass violence, with many incels citing Rodger’s own disturbing manifesto as an inspiration.
The authorities are taking note. Last month, the Texas Department of Public Safety released a report finding that incels “are an emerging domestic terrorism threat as current adherents demonstrate marked acts or threats of violence in furtherance of their social grievance.”
Now a group of computer scientists has painted the most complete picture yet of the misogynistic groups that fuel the incel movement online.
The “manosphere,” as it is known, is divided into four broad groups. “Men’s rights activists” (MRAs) claim that family law and social institutions discriminate against men. “Men going their own way” (MGTOW) take this feeling of grievance further, arguing that society can’t be “amended”; they often avoid women, blaming them for their problems. “Pick-up artists” (PUAs), meanwhile, date and harass women; they believe society is “feminizing” men.
And then there are the incels, the most potentially violent of the group. Incels abide by the “black pill,” a belief that women use their sexual power to dominate men socially. For that, incels want revenge.
The team’s analysis found that the manosphere is evolving—and fast. Over the past 10 years, the population of men identifying as men’s rights activists and MGTOW—traditionally older and less violent—is falling while younger, more toxic PUA and incel communities have seen a spike.
Worryingly, it seems that there has been a significant migration from men’s rights groups to incel groups. Every year since 2015, around 8% of MRA or MGTOW members appear to have become more radicalized and joined incel groups online.
“The older [groups] are dying off,” says coauthor Jeremy Blackburn, an assistant professor at Binghamton University.
Indeed, it seems that not only are older, less violent groups dying off, but membership in the more violent groups is becoming more toxic. To determine the level of hate being espoused by these groups, the team used a machine-learning tool developed by Google, called Perspective, that looks for keywords in speech. It produces a “toxicity score” to give an idea of how much hate speech is being used in the forums.
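The Perspective API that the researchers used is publicly available: a client sends a comment to Google’s `comments:analyze` endpoint and gets back a summary toxicity score between 0 and 1. The sketch below shows, in broad strokes, what scoring a single post might look like; the request and response shapes follow Perspective’s documented format, but `api_key` and the helper names are placeholders, and the study’s actual pipeline is not described at this level of detail.

```python
import json
from urllib import request

# Endpoint for Google's Perspective API; a (free) API key is required.
API_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_request(text):
    """Build the JSON body asking Perspective for a TOXICITY score."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
        "doNotStore": True,
    }

def extract_score(response):
    """Pull the 0-1 toxicity summary score out of a Perspective response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def score_comment(text, api_key):
    """Send one comment to Perspective and return its toxicity score."""
    req = request.Request(
        f"{API_URL}?key={api_key}",
        data=json.dumps(build_request(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return extract_score(json.load(resp))
```

Averaging these per-post scores over a forum and over time is one straightforward way to produce the kind of community-level toxicity trend the study reports.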
The team’s analysis showed that speech in the most extreme manosphere groups on Reddit, known as subreddits, was far more hateful than the speech of a random sample of Reddit users, and more on the wavelength of fringe far-right hate groups like those that frequent the social network Gab. And it’s getting worse. Over time the toxicity score has risen across all manosphere forums.
To keep track of the various manosphere groups, the team had to skim seven dedicated forums, along with 57 subreddits and a number of specialized wiki groups. Many of these wikis sprang up after the groups were banned from social media for their extreme views. The team built software to scrape information on threads dating back to 2015, encompassing 138,000 users and 7.5 million posts.
The way these groups use language made the task tricky. Summer Long, a research assistant on the project, says that the extreme end of the manosphere often uses vulgarity as a self-deprecating measure, which can confuse the systems trained to look for such words.
Incels also often use seemingly innocuous language to sidestep Reddit moderators. One term that appeared often was “smv,” which stands for “sexual market value.” And one common trope is “spinning plates,” used by pickup artists who date as many women as possible. To a casual observer, those words might mean nothing. To a wannabe incel, they are a sign he’s come to the right place.
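Spotting this coded vocabulary is, at its simplest, a lexicon-matching problem: compile the known terms into a word-boundary pattern and flag any post that contains them. The sketch below uses only terms mentioned in this article as an illustrative lexicon; the researchers’ actual keyword lists were larger and forum-specific.

```python
import re

# Illustrative lexicon built from terms mentioned in the article.
CODED_TERMS = ["smv", "sexual market value", "spinning plates", "black pill"]

# \b boundaries keep "smv" from matching inside unrelated words.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in CODED_TERMS) + r")\b",
    re.IGNORECASE,
)

def find_coded_terms(text):
    """Return the coded terms that appear in a post, lowercased."""
    return sorted({m.group(1).lower() for m in PATTERN.finditer(text)})
```

A matcher like this is crude by design: it is cheap to run over millions of posts, but it misses novel slang, which is one reason human review and broader ML tools are still needed alongside it.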
“It’s worth noting that this is a big challenge and that [our way of] measuring toxicity is not perfect,” says Blackburn, noting that Google Perspective has been shown to miss problematic language and might even exhibit racial bias. Still, he thinks that this is a major first step toward identifying people’s migration from less violent groups to more violent ones.
So what can be done? One step might be to create tools to help spot and protect potential victims, along with an earlier analysis of when and how men’s rights and MGTOW groups get radicalized, says Blackburn.
Reddit has taken steps to crack down on incel-sympathizing subreddits. For example, r/Incel has been banned since November 2017, but an alternate subreddit, r/Braincels, quickly took its place, gaining nearly 17,000 followers. It was banned in October 2019. After the publication of Blackburn and his colleagues’ paper on the arXiv preprint server, Reddit put the r/MGTOW subreddit in quarantine, which means that its content is deemed “extremely offensive or upsetting to the average redditor,” it can’t generate ad revenue, and visitors must click a pop-up saying they understand that other redditors find it offensive. But this crackdown has forced many incels toward even more extreme sites, like Gab.
Long said spending time in these forums and subreddits as a woman was “eye-opening,” and that she could see how minds are “poisoned” in an echo chamber.
“It’s horrifying,” she says. “But you can see how it molds someone’s view into being fatalistic. [It’s] a no-hope ideology.”
Editor's note: We amended the date that r/Braincels was banned.