The Limits of Fact-Checking Facebook

Don’t expect truth to suddenly have a resurgence now that the world’s largest social network is going to flag certain dubious material.
December 20, 2016

“Everyone is entitled to his own opinion, but not his own facts.” The late New York senator Daniel Patrick Moynihan’s ethical maxim got a workout in 2016, and it may face its stiffest test yet now that Facebook will use fact-checking organizations to help debunk articles shared on its network.

Facebook’s move comes in response to the outcry over the persistence of hoax stories during the recent election campaign. But not all such stories will be vetted now. Although fact-checkers typically respond to political claims of all kinds, rating them true or false or somewhere in between, Facebook’s new partnership is focused strictly on identifying the “worst of the worst” bogus content being churned out in the U.S. and abroad to generate clicks and ad dollars. One such fake story said Republicans are planning to cut Social Security “As Much As 50% Immediately”; another claimed that President Obama planned to “Immediately Pay Blacks ‘Reparations.’”

“We’re going to be looking at very clear-cut falsehoods,” says Aaron Sharockman, executive director of PolitiFact, one of seven organizations on board with the new feature, the others being ABC News, the Associated Press, FactCheck.org, Snopes, Climate Feedback, and the Washington Post. “If there are things that might be misleading, or spinning the truth, we’re probably going to avoid them as part of this partnership.”

This means that large numbers of misleading and manipulative articles will remain untouched on Facebook as long as they have some basis in reality. “There are other forums where fact-checkers can make their cases on the gray areas and half-truths,” says Alexios Mantzarlis, director of the International Fact-Checking Network at Poynter, whose “Code of Principles” is being used to guide Facebook’s stable of trusted third parties. (None of the organizations involved is being compensated in connection with the new feature.)

The structure of Facebook has made it uniquely fertile ground for propagating misinformation. While a person might be savvy enough to identify a spoofed news site that comes up in a Google search and discount the information accordingly, Facebook’s walled garden has meant that, until now, fake news sites have appeared no differently in users’ feeds than legitimate ones. Add to that the fact that information on Facebook is shared among trusted friends and family members, and it’s no surprise that a post-election Ipsos poll conducted for BuzzFeed News found that people more likely to look to Facebook for their news were also more likely to rate bogus headlines as accurate.

Facebook is clearly hesitant to act as an information gatekeeper, especially given the controversy that erupted over accusations that human editors contracted by the company suppressed conservative news in the site’s trending news section (a dubious criticism that has itself been fact-checked). Articles deemed fake will be downgraded in the social network’s feeds, but not erased entirely. Facebook users attempting to share these articles will be prompted by a notice stating: “Before you share this story, you might want to know that independent fact-checkers disputed its accuracy.”

How users will respond to these new notices is an open question. Marking an article as “disputed” as opposed to “false” could play into the partisan narrative of there being two sides to every claim. In addition, a notice that fact-checkers from mainstream media organizations like the Washington Post and the Associated Press have flagged a post may well be a point in an article’s favor in the eyes of many. According to research conducted for the American Press Institute last year, Republicans tend not to view fact-checkers as favorably as Democrats.

Facebook’s cautious approach might be the only practical one, however, given the backlash it’s already received over its first tentative steps. “It’s gone from ‘Oh my god, Facebook isn’t doing anything’ to ‘Oh my god, Facebook is censoring everything,’” says Mantzarlis.

Besides, swimming against the tide is nothing new for fact-checkers, says Lucas Graves, a professor at the University of Wisconsin who published Deciding What’s True: The Rise of Political Fact-Checking in American Journalism in September. “A fact-check never yields the immediate and decisive impact that we might hope for in an ideal world,” Graves says. “We always imagine that you can expose a claim as being false, and people will stop believing it and politicians will stop repeating it, but it doesn’t work that way.”

Matt Mahoney is a freelance fact-checker who has been a staff researcher at MIT Technology Review and the Boston Globe.