The news: Facebook announced on Tuesday that it will remove “any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.” QAnon, the pro-Trump conspiracy theory centered on the belief that the president of the United States is at war with a secret satanic pedophile ring run by liberals, has grown into an “omniconspiracy” in recent months. Accordingly, it has become a powerful distributor of conspiratorial thinking on a variety of topics—including misinformation about the pandemic and the presidential election.
The context: This goes further than the more limited ban announced in August, when Facebook said it would remove pages, groups, and accounts containing “discussions of potential violence.” By then, QAnon had already inspired a growing list of destructive, sometimes violent, acts; in 2019, the FBI concluded that QAnon was potentially capable of inspiring violence.
Why now? QAnon flourished for years on social media before this summer, and many critics felt that Facebook’s partial ban was too little, too late. The new ban was likely prompted by the theory’s staggering growth on social media since March (an internal Facebook study this summer found that QAnon-associated groups had millions of members). Tuesday’s announcement also cited QAnon’s role in spreading dangerous misinformation during the wildfires in the western United States as another reason for the more aggressive ban.
Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been tracking QAnon since its early days, said in a text message that while the announcement will likely fuel rumors among QAnon supporters that this ban amounts to “election interference” against Trump, the timing suggests that Facebook is trying to “AVOID further spread of election disinfo” from QAnon’s distribution networks by acting now.
QAnon believers were expecting this: Although QAnon has a large presence on Facebook, its believers are active on most social-media platforms and have been anticipating a more intense Facebook and Twitter crackdown for a while. They had time to prepare, and at this point they have experience working around bans. For instance, the “Q” account at the center of the conspiracy theory recently instructed followers to “camouflage” themselves online and drop references to “Q” or “QAnon” in order to avoid bans targeting those keywords. The community’s immediate reaction to Facebook’s announcement, Friedberg said, was to use Twitter to promote alternative locations for QAnon believers to organize online. Gab, a social-media site popular with the far right, has already started to court QAnon believers and influencers.