The news: Facebook announced on Tuesday that it will remove “any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.” QAnon, the pro-Trump conspiracy theory centered on the belief that the president of the United States is at war with a secret satanic pedophile ring run by liberals, has grown into an “omniconspiracy” in recent months. Accordingly, it has become a powerful distributor of conspiratorial thinking on a variety of topics—including misinformation about the pandemic and the presidential election.
The context: This goes further than the narrower ban announced in August. At the time, Facebook said it would remove pages, groups, and accounts containing “discussions of potential violence.” By then, QAnon had inspired a growing list of destructive, sometimes violent, acts. In 2019, the FBI warned that QAnon was potentially capable of inspiring violence.
Why now? QAnon flourished for years on social media before this summer, and many critics felt that Facebook’s partial ban was too little, too late. But it was likely prompted by the theory’s staggering growth on social media since March (an internal Facebook study this summer found that QAnon-associated groups had millions of members). Today’s announcement referred to QAnon’s involvement in spreading dangerous misinformation during the wildfires in the western United States as another reason for the more aggressive ban.
Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been tracking QAnon since its early days, said in a text message that while the announcement will likely fuel rumors among QAnon supporters that this ban amounts to “election interference” against Trump, the timing suggests that Facebook is trying to “AVOID further spread of election disinfo” from QAnon’s distribution networks by acting now.
QAnon believers were expecting this: Although QAnon has a large presence on Facebook, its believers are active on most social-media platforms, and they have been anticipating a more intense Facebook and Twitter crackdown for a while. They had time to prepare, and at this point they have some experience working around bans. For instance, the “Q” account at the center of the conspiracy theory recently instructed followers to “camouflage” themselves online and drop references to “Q” or “QAnon” in order to avoid bans targeting those keywords. The community’s immediate reaction to Facebook’s announcement, Friedberg said, was to use Twitter to promote alternative locations for QAnon believers to organize online. Gab, a social-media site popular with the far right, has already started to court QAnon believers and influencers.