Silicon Valley

Facebook is stepping up its efforts to debunk coronavirus falsehoods

April 17, 2020
Photo by Joshua Hoehne | Unsplash

The news: Facebook will start directing people who have interacted with misinformation about coronavirus to a myth-busting page on the World Health Organization’s website. “We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed,” Facebook’s VP of integrity, Guy Rosen, wrote in a blog post. These messages will appear in a post at the top of people’s news feeds, labeled “Help friends and family avoid false information about COVID-19.”

The context: There are a lot of harmful myths and hoaxes about covid-19 being promulgated on social media, most notably the idea that there’s a link with new 5G networks, which has spread across Europe and led to attacks on phone towers in the UK. People are also sharing dangerous falsehoods about cures and claims that the virus is some sort of man-made weapon. Human rights group Avaaz released a report this week that examined 100 pieces of misinformation on Facebook and found the posts had been shared over 1.7 million times and seen approximately 117 million times. Social-media companies have promised to take a more proactive approach to taking down misinformation about coronavirus, but the scale of the problem is huge.

What this step will (and won’t) do: This does not mean Facebook is going to alert you if you’ve viewed or shared a lie. It will only show people a WHO link in their news feed, which they can easily scroll past and ignore, or click on and not read properly. Facebook has always been loath to take an interventionist approach on fact-checking. However, in its report, Avaaz says a new study shows that if Facebook proactively “corrected the record” by providing users with corrections from fact-checkers, it could cut belief in falsehoods by an average of 50%.

Deep Dive


Frances Haugen testifies during a Senate Committee

The Facebook whistleblower says its algorithms are dangerous. Here’s why.

Frances Haugen’s testimony at the Senate hearing today raised serious questions about how Facebook’s algorithms work—and echoes many findings from our previous investigation.

Sophie Zhang

She risked everything to expose Facebook. Now she’s telling her story.

Sophie Zhang, a former data scientist at Facebook, revealed that it enables global political manipulation and has done little to stop it.

Photograph of someone’s badge saying “hands off my DNA!”

Covid conspiracy theories are driving people to anti-Semitism online

Old and overtly anti-Semitic fantasies are gaining new adherents, and far-right activists have been working to convert anti-lockdown beliefs to anti-Semitism too.
