The news: Facebook announced on Tuesday that it will remove “any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.” QAnon, the pro-Trump conspiracy theory centered on the belief that the president of the United States is at war with a secret satanic pedophile ring run by liberals, has grown into an “omniconspiracy” in recent months. Accordingly, it has become a powerful distributor of conspiratorial thinking on a variety of topics—including misinformation about the pandemic and the presidential election.
The context: This goes further than the narrower ban announced in August, when Facebook said it would remove pages, groups, and accounts containing “discussions of potential violence.” By then, QAnon had inspired a growing list of destructive, sometimes violent, acts; in 2019, the FBI had warned that QAnon could inspire violence.
Why now? QAnon flourished for years on social media before this summer, and many critics felt that Facebook’s partial ban was too little, too late. The stricter ban was likely prompted by the theory’s staggering growth on social media since March (an internal Facebook study this summer found that QAnon-associated groups had millions of members). Today’s announcement cited QAnon’s role in spreading dangerous misinformation during the wildfires in the western United States as another reason for the more aggressive ban.
Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been tracking QAnon since its early days, said in a text message that while the announcement will likely fuel rumors among QAnon supporters that this ban amounts to “election interference” against Trump, the timing suggests that Facebook is trying to “AVOID further spread of election disinfo” from QAnon’s distribution networks by acting now.
QAnon believers were expecting this: Although QAnon has a large presence on Facebook, its believers are active on most social-media platforms, and they have been anticipating a more sweeping Facebook and Twitter crackdown for a while. They had time to prepare, and at this point they have experience working around bans. For instance, the “Q” account at the center of the conspiracy theory recently instructed followers to “camouflage” themselves online and drop references to “Q” or “QAnon” in order to avoid bans targeting those keywords. The community’s immediate reaction to Facebook’s announcement, Friedberg said, was to use Twitter to promote alternative places for QAnon believers to organize online. Gab, a social-media site popular with the far right, has already started to court QAnon believers and influencers.