The news: Facebook will start directing people who have interacted with misinformation about coronavirus to a myth-busting page on the World Health Organization’s website. “We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed,” Facebook’s VP of integrity, Guy Rosen, wrote in a blog post. These messages will appear as a post at the top of people’s news feeds, labeled “Help friends and family avoid false information about COVID-19.”
The context: There are a lot of harmful myths and hoaxes about covid-19 being promulgated on social media, most notably the idea that there’s a link with new 5G networks, which has spread across Europe and led to attacks on phone towers in the UK. People are also sharing dangerous falsehoods about cures and claims that the virus is some sort of man-made weapon. Human rights group Avaaz released a report this week that examined 100 pieces of misinformation on Facebook and found the posts had been shared over 1.7 million times and seen approximately 117 million times. Social-media companies have promised to take a more proactive approach to taking down misinformation about coronavirus, but the scale of the problem is huge.
What this step will (and won’t) do: This does not mean Facebook will alert you if you’ve viewed or shared a lie. It will only show people a WHO link in their news feed, which they can easily scroll past and ignore, or click on without reading properly. Facebook has always been loath to take an interventionist approach to fact-checking. However, in its report, Avaaz cites a new study suggesting that if Facebook proactively “corrected the record” by providing users with corrections from fact-checkers, it could cut belief in falsehoods by an average of 50%.