
She risked everything to expose Facebook. Now she’s telling her story.

Sophie Zhang, a former data scientist at Facebook, revealed that it enables global political manipulation and has done little to stop it.

Sophie Zhang
Christie Hemm Klok
July 29, 2021

The world first learned of Sophie Zhang in September 2020, when BuzzFeed News obtained and published highlights from an abridged version of her nearly 8,000-word exit memo from Facebook.

Before she was fired, Zhang was officially employed as a low-level data scientist at the company. But she had become consumed by a task she deemed more important: finding and taking down fake accounts and likes that were being used to sway elections globally.

Her memo revealed that she’d identified dozens of countries, including India, Mexico, Afghanistan, and South Korea, where this type of abuse was enabling politicians to mislead the public and gain power. It also revealed how little the company had done to mitigate the problem, despite Zhang’s repeated efforts to bring it to the attention of leadership.


“I know that I have blood on my hands by now,” she wrote.

On the eve of her departure, Zhang was still debating whether to write the memo at all. It was perhaps her last chance to create enough internal pressure on leadership to start taking the problems seriously. In anticipation of writing it, she had turned down a nearly $64,000 severance package that would have involved signing a nondisparagement agreement. She wanted to retain the freedom to speak critically about the company.

But it was just two months before the 2020 US election, and she was disturbed by the idea that the memo could erode the public’s trust in the electoral process if prematurely released to the press. “I was terrified of somehow becoming the James Comey of 2020,” she says, referring to the former FBI director who, days before the 2016 election, told Congress the agency had reopened an investigation into Hillary Clinton’s use of a private email server. Clinton went on to blame Comey for her loss.

To Zhang’s great relief, that didn’t happen. And after the election passed, she proceeded with her original plan. In April, she came forward in two Guardian articles with her face, her name, and even more detailed documentation of the political manipulation she’d uncovered and Facebook’s negligence in dealing with it.

Her account supplied concrete evidence to support what critics had long been saying on the outside: that Facebook makes election interference easy, and that unless such activity hurts the company’s business interests, it can’t be bothered to fix the problem.

In a statement, Joe Osborne, a Facebook spokesperson, vehemently denied these claims. “For the countless press interviews she’s done since leaving Facebook, we have fundamentally disagreed with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform,” he said. “We aggressively go after abuse around the world and have specialized teams focused on this work. As a result, we’ve already taken down more than 150 networks of coordinated inauthentic behavior … Combatting coordinated inauthentic behavior is our priority.”

By going public and eschewing anonymity, Zhang risked legal action from the company, harm to her future career prospects, and perhaps even reprisals from the politicians she exposed in the process. “What she did is very brave,” says Julia Carrie Wong, the Guardian reporter who published her revelations.

After nearly a year of avoiding personal questions, Zhang is now ready to tell her story. She wants the world to understand how she became so involved in trying to protect democracy worldwide and why she cared so deeply. She’s also tired of being in the closet as a transgender woman, a core aspect of her identity that informed her actions at Facebook and after she left.

Her story reveals that it is pure luck we now know so much about how Facebook enables election interference globally. Not only was Zhang the sole person fighting this form of political manipulation; it wasn’t even her job. She had discovered the problem because of a unique confluence of skills and passion, and then taken it upon herself, driven by an extraordinary sense of moral responsibility.

To regulators around the world considering how to rein in the company, this should be a wake-up call.

Zhang never planned to be in this position. She’s deeply introverted and hates being in the limelight. She’d joined Facebook in 2018 after the financial strain of living on part-time contract work in the Bay Area had worn her down. When she received Facebook’s offer, she was upfront with her recruiter: she didn’t think the company was making the world better, but she would join to help fix it.

“They told me, ‘You’d be surprised how many people at Facebook say that,’” she remembers.

It was easier said than done. Like many new hires, she’d joined without being assigned to a specific team. She wanted to work on election integrity, which looks for ways to mitigate election-related platform abuse, but her skills didn’t match the team’s openings. She settled instead for a new team tackling fake engagement.

Fake engagement refers to things such as likes, shares, and comments that have been bought or otherwise inauthentically generated on the platform. The new team focused more narrowly on so-called “scripted inauthentic activity”—fake likes and shares produced by automated bots and used to drive up someone’s popularity.

In the vast majority of such cases, people were merely obtaining likes for vanity. But half a year in, Zhang intuited that politicians could do the same things to increase their influence and reach on the platform. It didn’t take long for her to find examples in Brazil and India, which were both preparing for general elections.

In the process of searching for scripted activity, she also found something far more worrying. The administrator for the Facebook page of the Honduran president, Juan Orlando Hernández, had created hundreds of pages with fake names and profile pictures to look just like users—and was using them to flood the president’s posts with likes, comments, and shares. (Facebook bars users from making multiple profiles but doesn’t apply the same restriction to pages, which are usually meant for businesses and public figures.)

The activity didn’t count as scripted, but the effect was the same. Not only could it mislead the casual observer into believing Hernández was better liked and more popular than he was, but it also boosted his posts higher in people’s newsfeeds. For a politician whose 2017 reelection victory was widely believed to be fraudulent, the brazenness and its implications were alarming.

“Everyone agreed that it was terrible. No one could agree who should be responsible, or even what should be done.”

But when Zhang raised the issue, she says, she received a lukewarm reception. The pages integrity team, which handles abuse of and on Facebook pages, wouldn’t block the mass manufacture of pages to look like users. The newsfeed integrity team, which tries to improve the quality of what appears in users’ newsfeeds, wouldn’t remove the fake likes and comments from the ranking algorithm’s consideration. “Everyone agreed that it was terrible,” Zhang says. “No one could agree who should be responsible, or even what should be done.”

After Zhang applied pressure for a year, the network of fake pages was finally removed. A few months later, Facebook created a new “inauthentic behavior policy” to ban fake pages masquerading as users. But this policy change didn’t address a more fundamental problem: no one was being asked to enforce it.

So Zhang took the initiative herself. When she wasn’t working to scrub away vanity likes, she diligently combed through streams of data, searching for the use of fake pages, fake accounts, and other forms of coordinated fake activity on politicians’ pages. She found cases in dozens of countries, most egregiously in Azerbaijan, where the pages technique was being used to harass the opposition.

But finding and flagging new cases wasn’t enough. Zhang found that in order to get any networks of fake pages or accounts removed, she had to persistently lobby the relevant teams. In countries where such activity posed little PR risk to the company, enforcement could be put off repeatedly. (Facebook disputes this characterization.) The responsibility weighed on her heavily. Was it more important to push for a case in Bolivia, with a population of 11.6 million, or in Rajasthan, India, with a population close to 70 million?

Then, in the fall of 2019, weeks of deadly civil protest broke out in Bolivia after the public contested the results of its presidential election. Only a few weeks earlier, Zhang had indeed deprioritized the country to take care of what seemed like more urgent cases. The news consumed her with guilt. Intellectually, she knew there was no way to draw a direct connection between her decision and the events. The fake engagement had been so minor that the effect was likely negligible. But psychologically and emotionally, it didn’t matter. “That’s when I started losing sleep,” she says.

Whereas someone else might have chosen to leave such a taxing job or perhaps absolve herself of responsibility as a means of coping, Zhang leaned in, at great personal cost, in an attempt to singlehandedly right a wrong.

Over the year between the events in Bolivia and her firing, the exertion sent her health into sharp decline. She already suffered from anxiety and depression, and both grew significantly, even dangerously, worse. Always a voracious reader of world news, she could no longer distance herself from the political turmoil in other countries. The pressure pushed her away from friends and loved ones. She grew increasingly isolated and broke up with her girlfriend. She upped her anti-anxiety and antidepressant medication until her dose had increased sixfold.

For Zhang, the explanation of why she cared so much is tied up in her identity. She grew up in Ann Arbor, Michigan, the daughter of parents who’d immigrated from mainland China. From an early age, she was held to high academic standards and proved a precocious scholar. At six or seven, she read an introductory physics book and grew fascinated by the building blocks of the universe. Her passion would lead her to study cosmology at the University of Michigan, where she published two research papers, one as the sole author.

“She was blazing smart. She may be the smartest undergrad student I’ve ever worked with,” recalls Dragan Huterer, her undergraduate advisor. “I would say she was more advanced than a graduate student.”

But her childhood was also marked by severe trauma. As early as five years old, she began to realize she was different. She read a children’s book about a boy whose friends told him that if he kissed his elbow he would turn into a girl. “I spent a long time after that trying to kiss my elbow,” she says.

She did her best to hide it, understanding that her parents would find her transgender identity intolerable. But she vividly remembers the moment her father found out. It was spring of eighth grade. It had just rained. And she was cowering in the bathroom, contemplating whether to jump out the window, as he beat down the door.

In the end, she chose not to jump and let him hit her until she was bloody, she says: “Ultimately, I decided that I was the person who stayed in imperfect situations to try and fix them.” The next day, she wore a long-sleeved shirt to cover up the bruises and prepared an excuse in case a teacher noticed. None did, she says.

(When reached by email, her father denied the allegations. “I am sad that she alleges that I beat her as a child after I discovered her transgender identity, which is completely false,” he wrote. But multiple people who knew Zhang from high school to the present day have corroborated her account of her father’s abusive behavior.)

“To give up on them and abandon them would be a betrayal of the very core of my identity.”

In college, she decided to transition, after which her father disowned her. But she soon discovered that finally being perceived correctly as a woman came with its own consequences. “I knew precisely how people treated me when they thought that I was a dude. It was very different,” she says.

After being accepted to all the top PhD programs for physics, she chose to attend Princeton University. During orientation, the person giving a tour of the machine shop repeatedly singled her out in front of the group, falsely assuming she was less than competent. “It was my official introduction to Princeton, and a very appropriate one,” she says.


From there, the sexism only got worse. Almost immediately, a male grad student began to stalk and sexually harass her. To cope, she picked a thesis advisor in the biophysics department, which allowed her to escape her harasser by conducting research in another building. The trouble was she wasn’t actually interested in biophysics. And whether for this or other reasons, her interest in physics slowly dissolved.

Three years in, deeply unhappy, she decided to leave the program, though not without finally reporting the harassment to the university. “They were like, ‘It’s your word against his.’ You can probably guess now why I extensively documented everything I gave to Julia,” she says, referring to Julia Carrie Wong at the Guardian. “I didn’t want to be in another ‘He said/she said’ situation.”

(A Princeton spokesperson said he was unable to comment on individual situations but stated the university’s commitment to “providing an inclusive and welcoming educational and working environment.” “Princeton seeks to support any member of the campus community who has experienced sexual misconduct, including sexual harassment,” he said.)

“What these experiences have in common is the fact that I’ve experienced repeatedly falling through the cracks of responsibility,” Zhang wrote in her memo. “I never received the support from the authority figures I needed … In each case, they completed the letter of their duty but failed the spirit, and I paid the price of their decisions.”

“Perhaps then you can understand why this was so personal for myself from the very start, why I fought so hard to keep the people of Honduras and Azerbaijan from slipping through those cracks,” she wrote. “To give up on them and abandon them would be a betrayal of the very core of my identity.”

It was during the start of her physical and mental decline in the fall of 2019 that Zhang began thinking about whether to come forward. She wanted to give Facebook’s official systems a chance to work. But she worried about being a single point of failure. “What if I got hit by a bus the next day?” she says. She needed someone else to have access to the same information.

By coincidence, an email from a journalist landed in her inbox. Wong, then a senior tech reporter at the Guardian, had been messaging Facebook employees looking to cultivate new sources. Zhang took the chance and agreed to meet for an off-the-record conversation. That day, she dropped her company-issued phone and computer off at a former housemate’s place as a precaution, knowing that Facebook had the ability to track her location. When she returned, she looked a bit relieved, the former housemate, Ness Io Kain, remembers: “You could tell that she felt like she’d accomplished something. It’s pretty silent, but it’s definitely palpable.”

For a while, things at Facebook seemed to be making progress. She saw the change in policy and the takedown of the Honduran president’s fake network as forward momentum. She was called on repeatedly to help handle emergencies and praised for work she was told was valued and important.

But despite her repeated attempts to push for more resources, leadership cited different priorities. They also dismissed Zhang’s suggestions for a more sustainable solution, such as suspending or otherwise penalizing politicians who were repeat offenders. It left her to face a never-ending firehose: The manipulation networks she took down quickly came back, often only hours or days later. “It increasingly felt like I was trying to empty the ocean with a colander,” she says.

“I have never hated my autism more than when I joined Facebook.”

Then, in January of 2020, the tide turned. Both her manager and manager’s manager told her to stop her political work and stick to her assigned job. If she didn’t, her services at the company would no longer be needed, she remembers the latter saying. But without a team assigned to continue her work, Zhang kept doing some in secret.

As the pressure mounted and her health worsened, Zhang realized she would ultimately need to leave. She made a plan to depart after the US election, considering it the last and most important event she needed to deal with. But leadership had other plans. In August, she was informed that she would be fired for poor performance.

On her last day, hours after she posted her memo internally, Facebook deleted it (though the company later restored an edited version after widespread employee anger). A few hours later, an HR person called her, asking her to also remove a password-protected copy she had posted on her personal website. She tried to bargain: she would do so if Facebook restored the internal version. Instead, the next day she received a notice from her hosting provider that it had taken down her entire website after a complaint from Facebook. A few days after that, it took down her domain as well.

Even after all that Facebook put her through, Zhang defaults to blaming herself. In her memo, she apologized to colleagues for any trouble she might have caused them and for leaving them without achieving more. In a Reddit AMA months later, she apologized to the citizens of different countries for not acting fast enough and for failing to reach a long-term solution.

Zhang, who is autistic, wonders aloud to me what she could have accomplished if she were not. “I have no talent for persuasion and convincing,” she says. “If I were someone born with a silver tongue, perhaps I could have made changes.”

“I have never hated my autism more than when I joined Facebook.”

In preparation for going public, Zhang made one final sacrifice: to conceal her trans identity, not for fear of harassment, but for fear that it would distract from her message. In the US, where transgender rights are highly politicized, she didn’t want protecting democracy to become a partisan issue. Abroad, where some countries treat being transgender as a crime punishable by prison time or even death, she didn’t want people to stop listening.

It was in keeping with a sacrifice she’d repeatedly made when policing election interference globally. She treated all politicians equally, even when removing the fake activity of one in Azerbaijan inevitably boosted an opponent who espoused homophobia. “I did my best to protect democracy and rule of law globally for people, regardless of whether they believed me to be human,” she says with a deep sigh. “But I don’t think anyone should have to make that choice.”

The night the Guardian articles were published, she anxiously awaited the public reaction, worried about whether she’d be able to handle the media attention. “I think she actually surprised herself at how good she was in interviews,” says her girlfriend, Lisa Danz, whom Zhang began dating after leaving Facebook. “She found that when there’s material that she knows very well and she’s just getting asked questions about it, she can answer.”

The impact ultimately fell short of what Zhang envisioned. Several media outlets in the US did follow-up pieces, as did foreign outlets from countries affected by the manipulation activity. But as far as she’s aware, it didn’t achieve what she’d been hoping for: a big enough PR scandal to make Facebook finally prioritize the work she left behind.

Facebook once again disputes this characterization, saying the fake-engagement team has continued Zhang’s work. But Zhang points to other evidence: the network of fake pages in Azerbaijan is still there. “It’s clear they haven’t been successful,” she says.

Nonetheless, Zhang doesn’t regret her decision to come forward. “I was the only one in this position of responsibility from the start,” she says, “and someone had to take the responsibility and do the utmost to protect people.”

Without skipping a beat, she then rattles off the consequences that others have faced for going up against the powerful in more hostile countries: journalists being murdered for investigating government corruption, protesters being gunned down for registering their dissent.

“Compared to them, I’m small potatoes,” she says.

Correction: An earlier version of this article misspelled Joe Osborne’s name.
