Election 2020

Facebook should at least label lying political ads

Facebook’s refusal to even touch false political ads sends the message that it only cares about lies spread by regular users and not politicians.
October 22, 2019

Yesterday, Facebook revealed its plan for fighting disinformation ahead of the 2020 US election. It includes spending $2 million on a media literacy project, making it easier to research political ads, and using more prominent fact-checking labels. Each step is commendable, but it all seems hypocritical coming from a company that refuses to do anything about political ads that contain false information.

The message seems to be that Facebook is very concerned with preventing falsehoods—but only when they are spread by regular users and not by the people who might be elected to positions of real power. At the same time, CEO Mark Zuckerberg was right when he said during a speech last week that “I don’t think most people want to live in a world where you can only post things that tech companies judge to be 100% true.”

But there’s a middle ground between Facebook deciding what everyone is allowed to see and letting politicians lie as they wish. Facebook should revisit its policy of not touching political content and instead put one of those new, prominent labels on top of political ads that contain false information (like the Trump campaign ad that lied about Joe Biden, or the fake Facebook ad that Elizabeth Warren bought to goad Zuckerberg). That way, the company can keep the ads up without letting falsehoods spread unnoticed, which is especially important because political ads are often microtargeted at communities that might be most likely to believe them.

To be clear, Facebook’s third-party fact-checking program has not been a panacea for the problem of disinformation. An enormous amount of content is posted every day, far too much for everything to be fact-checked. There are people who won’t trust the fact-checkers, and so a label is meaningless to them.

Facebook’s own execution leaves much to be desired as well. In July, the fact-checking platform Full Fact, one of Facebook’s partners, released a report criticizing the company for not sharing enough data and not responding quickly enough to content flagged as false. But to the extent that fact-checking is valuable (and the Full Fact report concluded that it was), political ads should be among the most carefully fact-checked, not the least.

Zuckerberg argues that the company avoids fact-checking politicians “because we think people should be able to see for themselves what politicians are saying.” But most people are not going to bother to fact-check a political ad or seek out journalism elsewhere debunking it. As a result, Facebook’s hands-off policy is not actually neutral. It favors, and helps support, candidates who have no qualms about lying and spreading conspiracy theories. The worst players win.

Having a specific fact-checking team dedicated to political ads could address many of these issues. Facebook already knows which ads are paid for by political campaigns; it’s not an endless content stream. Fact-checking ads wouldn’t make Facebook a censor. It also wouldn’t “prevent a politician’s speech from reaching its audience,” as Facebook’s vice president of global affairs, Nick Clegg, fears. It would ensure that the people who come across the ad are able to “see for themselves” what politicians are saying, and also see for themselves which politicians are comfortable with bald-faced lies.
