
Troll farms reached 140 million Americans a month on Facebook before 2020 election, internal report shows

“This is not normal. This is not healthy.”

September 16, 2021

Update: After this story was published, Facebook took down two of the five troll-farm pages we identified.

In the run-up to the 2020 election, the most highly contested in US history, Facebook’s most popular pages for Christian and Black American content were being run by Eastern European troll farms. These pages were part of a larger network that collectively reached nearly half of all Americans, according to an internal company report, and achieved that reach not through user choice but primarily as a result of Facebook’s own platform design and engagement-hungry algorithm.

The report, written in October 2019 and obtained by MIT Technology Review from a former Facebook employee not involved in researching it, found that after the 2016 election, Facebook failed to prioritize fundamental changes to how its platform promotes and distributes information. The company instead pursued a whack-a-mole strategy that involved monitoring and quashing the activity of bad actors when they engaged in political discourse, and adding some guardrails that prevented “the worst of the worst.”

But this approach did little to stem the underlying problem, the report noted. Troll farms—professionalized groups that work in a coordinated fashion to post provocative content, often propaganda, to social networks—were still building massive audiences by running networks of Facebook pages. Their content was reaching 140 million US users per month—75% of whom had never followed any of the pages. They were seeing the content because Facebook’s content-recommendation system had pushed it into their news feeds.

“Instead of users choosing to receive content from these actors, it is our platform that is choosing to give [these troll farms] an enormous reach,” wrote the report’s author, Jeff Allen, a former senior-level data scientist at Facebook.

Joe Osborne, a Facebook spokesperson, said in a statement that the company “had already been investigating these topics” at the time of Allen’s report, adding: “Since that time, we have stood up teams, developed new policies, and collaborated with industry peers to address these networks. We’ve taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”

In the process of fact-checking this story shortly before publication, MIT Technology Review found that five of the troll-farm pages mentioned in the report remained active.

This was the largest troll-farm page targeting African-Americans in October 2019. It remains active on Facebook.

The report found that troll farms were reaching the same demographic groups singled out by the Kremlin-backed Internet Research Agency (IRA) during the 2016 election, which had targeted Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one member of the Russian IRA, indicted for alleged interference in the 2016 US election, had also visited Macedonia around the emergence of its first troll farms, though it didn’t find concrete evidence of a connection. (Facebook said its investigations hadn’t turned up a connection between the IRA and Macedonian troll farms either.)

“This is not normal. This is not healthy,” Allen wrote. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes ... The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”

As long as troll farms found success in using these tactics, any other bad actor could too, he continued: “If the Troll Farms are reaching 30M US users with content targeted to African Americans, we should not at all be surprised if we discover the IRA also currently has large audiences there.”

Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part because of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who supplied the report. Allen declined to comment.

The report reveals the alarming state of affairs in which Facebook leadership left the platform for years, despite repeated public promises to aggressively tackle foreign-based election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.

Its revelations include:

  • As of October 2019, around 15,000 Facebook pages with a majority US audience were being run out of Kosovo and Macedonia, known bad actors during the 2016 election.
  • Collectively, those troll-farm pages—which the report treats as a single page for comparison purposes—reached 140 million US users monthly and 360 million global users weekly. Walmart’s page reached the second-largest US audience at 100 million.
  • The troll-farm pages also combined to form:
    • the largest Christian American page on Facebook, 20 times larger than the next largest—reaching 75 million US users monthly, 95% of whom had never followed any of the pages.
    • the largest African-American page on Facebook, three times larger than the next largest—reaching 30 million US users monthly, 85% of whom had never followed any of the pages.
    • the second-largest Native American page on Facebook, reaching 400,000 users monthly, 90% of whom had never followed any of the pages.
    • the fifth-largest women’s page on Facebook, reaching 60 million US users monthly, 90% of whom had never followed any of the pages.
  • Troll farms primarily affect the US but also target the UK, Australia, India, and Central and South American countries.
  • Facebook has conducted multiple studies confirming that content more likely to receive user engagement (likes, comments, and shares) is also more likely to be of a type known to be bad. Still, the company has continued to rank content in users’ newsfeeds according to what will receive the highest engagement.
  • Facebook forbids pages from posting content merely copied and pasted from other parts of the platform but does not enforce the policy against known bad actors. This makes it easy for foreign actors who do not speak the local language to post entirely copied content and still reach a massive audience. At one point, as much as 40% of page views on US pages went to those featuring primarily unoriginal content or material of limited originality.
  • Troll farms previously made their way into Facebook’s Instant Articles and Ad Breaks partnership programs, which are designed to help news organizations and other publishers monetize their articles and videos. At one point, thanks to a lack of basic quality checks, as much as 60% of Instant Article reads were going to content that had been plagiarized from elsewhere. This made it easy for troll farms to mix in unnoticed, and even receive payments from Facebook.

How Facebook enables troll farms and grows their audiences

The report looks specifically at troll farms based in Kosovo and Macedonia, which are run by people who don’t necessarily understand American politics. Yet because of the way Facebook’s newsfeed reward systems are designed, they can still have a significant impact on political discourse.

In the report, Allen identifies three reasons why these pages are able to gain such large audiences. First, Facebook doesn’t penalize pages for posting completely unoriginal content. If something has previously gone viral, it will likely go viral again when posted a second time. This makes it really easy for anyone to build a massive following among Black Americans, for example. Bad actors can simply copy viral content from Black Americans’ pages, or even Reddit and Twitter, and paste it onto their own page—or sometimes dozens of pages.

Second, Facebook pushes engaging content on pages to people who don’t follow them. When users’ friends comment on or reshare posts on one of these pages, those users will see it in their newsfeeds too. The more a page’s content is commented on or shared, the more it travels beyond its followers. This means troll farms, whose strategy centers on reposting the most engaging content, have an outsize ability to reach new audiences.

Third, Facebook’s ranking system pushes more engaging content higher up in users’ newsfeeds. For the most part, the people who run troll farms have financial rather than political motives; they post whatever receives the most engagement, with little regard to the actual content. But because misinformation, clickbait, and politically divisive content is more likely to receive high engagement (as Facebook’s own internal analyses acknowledge), troll farms gravitate to posting more of it over time, the report says.
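To make the dynamic concrete, here is a minimal sketch of the kind of engagement-first ranker the report criticizes. Everything in it (the Post fields, the page names, the engagement numbers) is an illustrative assumption, not Facebook’s actual ranking code:

```python
# Illustrative sketch only: a feed ranker that optimizes purely for
# engagement, with no penalty for copied content and no authenticity
# check. All names and numbers below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    page: str          # page that published the post
    is_copied: bool    # reposted from elsewhere on the platform
    engagement: int    # likes + comments + shares so far

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement surfaces first; originality and the identity
    # of the poster play no role in the ordering.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

feed = [
    Post("local_church_group", is_copied=False, engagement=120),
    Post("troll_farm_page", is_copied=True, engagement=50_000),  # recycled viral hit
]

for post in rank_feed(feed):
    print(post.page, post.engagement)
# The recycled viral post takes the top slot, and under the dynamics the
# report describes, it is also the post most likely to be pushed into
# the feeds of users who never followed the page.
```

Under a scheme like this, a page’s surest path to reach is to repost whatever has already proven viral, which is exactly the troll farms’ strategy.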

As a result, in October 2019, all of the top 15 pages targeting Christian Americans, 10 of the top 15 Facebook pages targeting Black Americans, and four of the top 12 Facebook pages targeting Native Americans were being run by troll farms.

“Our platform has given the largest voice in the Christian American community to a handful of bad actors, who, based on their media production practices, have never been to church,” Allen wrote. “Our platform has given the largest voice in the African American community to a handful of bad actors, who, based on their media production practices, have never had an interaction with an African American.”

“It will always strike me as profoundly weird ... and genuinely horrifying,” he wrote. “It seems quite clear that until that situation can be fixed, we will always be feeling serious headwinds in trying to accomplish our mission.”

The report also suggested a possible solution. “This is far from the first time humanity has fought bad actors in our media ecosystems,” he wrote, pointing to Google’s use of what’s known as a graph-based authority measure—which assesses the quality of a web page according to how often it cites and is cited by other quality web pages—to demote bad actors in its search rankings.

“We have our own implementation of a graph-based authority measure,” he continued. If the platform gave more consideration to this existing metric in ranking pages, it could help flip the disturbing trend in which pages reach the widest audiences.

When Facebook’s rankings prioritize engagement, troll-farm pages beat out authentic pages, Allen wrote. But “90% of Troll Farm Pages have exactly 0 Graph Authority … [Authentic pages] clearly win.”
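For readers unfamiliar with the technique, here is a minimal PageRank-style sketch of a graph-based authority measure. It is a textbook toy, not Facebook’s internal metric, and the link graph below is invented:

```python
# Textbook PageRank-style authority sketch; not Facebook's internal metric.
import numpy as np

def graph_authority(adjacency: np.ndarray, damping: float = 0.85,
                    iterations: int = 100) -> np.ndarray:
    """adjacency[i, j] = 1 if page i cites (links to) page j."""
    n = adjacency.shape[0]
    out_degree = adjacency.sum(axis=1, keepdims=True)
    # Rows become transition probabilities; dangling pages (no outgoing
    # links) spread their weight uniformly across all pages.
    transition = np.where(out_degree > 0,
                          adjacency / np.maximum(out_degree, 1),
                          1.0 / n)
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * transition.T @ rank
    return rank

# Toy graph: pages 0-2 stand in for authentic pages that cite one
# another; page 3 stands in for a troll-farm page that nobody links
# back to, so its score bottoms out near the baseline minimum.
links = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],  # the troll page links out, but nothing links back
])
print(graph_authority(links))  # page 3 receives the lowest authority
```

Weighting a signal like this in page ranking is what Allen suggests would let authentic pages “clearly win” over troll-farm pages that nobody genuinely cites.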

Systemic issues

A search of all the troll-farm pages listed in the report reveals that five are still active nearly two years later:

  • A page called “My Baby Daddy Ain’t Shit,” which was the largest Facebook page targeting African-Americans in October 2019.
  • A page called “Savage Hood,” targeting African-Americans.
  • A page called “Hood Videos,” targeting African-Americans.
  • A page called “Purpose of Life,” targeting Christians.
  • A page called “Eagle Spirit,” targeting Native Americans.
A troll-farm page targeting Christian Americans.

Facebook’s recent controversial “Widely Viewed Content” report suggests that some of the core vulnerabilities the troll farms exploited also remain. Fifteen of the 19 most viewed posts listed in the report were plagiarized from other posts that had previously gone viral on Facebook or another platform, according to an analysis from Casey Newton at The Verge.

Samantha Bradshaw, a postdoctoral research fellow at Stanford University who studies the intersection of disinformation, social media, and democracy, says the report “speaks to a lot of the deeper systemic problems with the platform and their algorithm in the way that they promote certain kinds of content to certain users, all just based on this underlying value of growth.” If those are not fixed, they will continue to create distorted economic incentives for bad actors, she adds: “That’s the problem.”

