The news: Pages spreading health misinformation got an estimated 3.8 billion views on Facebook this year as of May, according to an analysis by the human rights group Avaaz. Views peaked at nearly half a billion in April 2020 alone, just as the pandemic was rapidly escalating globally, it found. Content from the 10 worst-offending websites spreading health misinformation received almost four times as many estimated views on Facebook as content from the websites of 10 reliable sources, such as the World Health Organization and the US Centers for Disease Control and Prevention. In all, Avaaz studied 82 websites that spread health misinformation, including anti-vaccination campaigns, claims of bogus (and sometimes lethally dangerous) cures for covid-19, and articles falsely charging that the coronavirus death toll has been overcounted.
The impact: This is by no means a victimless issue. A study in the American Journal of Tropical Medicine and Hygiene last week found that around the world, at least 800 people may have died and 5,800 been admitted to hospital as a result of coronavirus misinformation in the first three months of 2020, many of them after drinking methanol or cleaning products that they believed could cure covid-19.
What can be done about it? Arguably, Facebook could simply ban the websites Avaaz has identified from its platform. The company promised to start taking a more proactive approach to fact-checking and removing covid-19 misinformation in April, but stopped short of saying it would alert people who have viewed or shared falsehoods. In its report, Avaaz says that taking this step could start to make a big dent in the number of people who believe misinformation. It also says that Facebook needs to “detox” its algorithm by downgrading misinformation posts in people’s news feeds, thus decreasing their reach. “Facebook has yet to effectively apply these solutions at the scale and sophistication needed to defeat this infodemic, despite repeated calls from doctors and health experts to do so,” the report concludes.
Facebook's response: A company spokesperson said that the firm applied warning labels to 98 million pieces of covid-19 misinformation from April to June this year. “We share Avaaz’s goal of limiting misinformation, but their findings don’t reflect the steps we’ve taken to keep it from spreading on our services,” they said.