Last month, the primary source for the Wall Street Journal’s Facebook Files revealed her identity in an episode of 60 Minutes. Frances Haugen, a former product manager at the company, says she came forward after she saw Facebook’s leadership repeatedly prioritize profit over safety. She then appeared before lawmakers in the US and the UK to talk about what she'd learned during her time at the firm.
But Haugen was not the first Facebook whistleblower to raise the alarm over the firm's inability—or unwillingness—to deal with serious problems caused by the platform's algorithms. In 2020 we learned about Sophie Zhang, who had been officially employed as a low-level data scientist at the company. When she quit, her blockbuster 8,000-word exit memo revealed that she’d identified dozens of countries, including India, Mexico, Afghanistan, and South Korea, where fake accounts and likes were allowing politicians to mislead the public and gain power. It also revealed how little the company had done to mitigate the problem, despite Zhang’s repeated efforts to bring it to the attention of leadership.
In the latest episode of I Was There When, a new oral history project from the In Machines We Trust podcast, we speak to Zhang about her efforts.
This episode was produced by Jennifer Strong, Anthony Green and Emma Cillekens. It’s edited by Michael Reilly and Mat Honan. It’s mixed by Garret Lang, with sound design and music by Jacob Gorski.
SOT: Frances Haugen: During my time at Facebook, I came to realize a devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook.
Jennifer: Frances Haugen is a former product manager at Facebook. She’s filed complaints with federal law enforcement claiming the social media giant’s leadership has repeatedly put profit over safety.
SOT: Frances Haugen: The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.
Jennifer: Her complaints came with a trove of documents that Haugen gathered before quitting... in an attempt to demonstrate the company had willfully chosen not to fix the problems on its platform.
Among them... that algorithms like the one behind your Facebook newsfeed amplify hate, misinformation and political unrest.
And she’s not the only whistleblower accusing the company of turning a blind eye to disinformation campaigns on the platform.
Sophie Zhang worked as a Facebook data scientist… and up until she was fired, she consumed herself with finding and taking down fake accounts, comments and likes that were being used to sway elections globally. Her blockbuster exit memo was 8-thousand words… and it revealed just how little Facebook had done to mitigate the problem.
I’m Jennifer Strong, and this is I Was There When.
It’s an oral history project featuring the stories of how watershed moments in artificial intelligence and computing happened… as told by the people who witnessed them.
Sophie Zhang: I'm Sophie Zhang. At Facebook I was a data scientist and I worked on the fake engagement team. By fake I mean, for instance, fake accounts, but also hacked accounts. And by engagement, I mean likes, comments, shares, et cetera. But in my spare time, I began moonlighting in the area of finding inauthentic political activity. Often very sophisticated, inauthentic political activity. I could argue that this was also fake engagement, but it was not what I was expected to work on.
When I began to look into the intersections between fake engagement and civic activity, I very quickly found results worldwide. I found results in Brazil, in India, in Indonesia, in many nations, but also in Honduras—quite a bit in Honduras actually, relative to its size. And I was putting together a report for leadership on the problem. I meant to take a screenshot of the page of the major recipient in Honduras, Juan Orlando Hernández, who, after I looked him up, turned out to be the president of Honduras.
I was going to his page to take a screenshot when I suddenly stopped, because I noticed something very unusual: many of the people who were liking his page and commenting were not people at all. They were pages pretending to be users. And so just to step back here, what are pages? What are users? Pages are a Facebook feature meant for public figures, public organizations, etc. So for instance, MIT Tech Review has a Facebook page. MIT Tech Review is not a person. The page is run by someone else on Facebook.
And so the intent of Facebook pages is that they're supposed to reflect public entities, and a single user can control many pages. So for instance, the same administrator can control the CNN page as well as CNN Philippines, CNN Europe, et cetera. But there was nothing preventing a user from setting up hundreds of pages, giving them names and profile pictures like real people, and having them act as real people. In fact, it was easier on their end, because they could quite easily switch between these pages without needing to log in and log off every time.
I quite quickly realized that there were thousands of these fake pages pretending to be users in Honduras. And a few hundred of them were personally run by the page administrator of the president of Honduras. This was someone who clearly held a position of significant trust over social media in the Honduran government. And they were not even hiding the fact that they were using thousands of fake assets to manipulate their own citizenry.
And so from the start, I was very naive. I thought, okay, I found this. These people were stupid. We caught them. I will hand them over to others. They'll take it over. I can go back to my actual job and everything will be fine. It was instead the start of a two-year sisyphean ordeal, because what happened when I raised it was that everyone agreed this was terrible. It was not controversial that this was bad. Everyone agreed that this should not be allowed, but the question was: What do we do about it? Do we have the ability to act on it? Is it within our policies to act on it, et cetera. From the start, I spoke about it to everyone who seemed relevant to it. I spoke about it to pages integrity, to groups integrity. I talked about it to the civic integrity team. I tried to get threat intelligence interested. Eventually I was talking to everyone who would listen. I spoke to everyone up to and including vice president Guy Rosen. Essentially it was like talking to the wind, like trying to empty the ocean with a colander. Eventually I figured out that the best way to get results was often not by trying to go through the proper channels, but by complaining publicly within the company on Workplace—which is essentially Facebook for the office—and making Workplace posts about the situation that others at the company could see. And many of them were of course upset about it, because this was not the sort of company they thought they were working for and wanted to be working for. I don't know whether it was by this means or by others, but eventually Facebook finally took down the operation of the Honduran government, which was international news.
A few weeks later, the operation came back using a different method. When we did get the takedown in Honduras, I was still naive and idealistic. I thought, okay, before this everyone was saying we didn't have a precedent, we didn't know what to do, but now we do have the precedent. Now we can say we have created this precedent; we can solve it in this manner. And so after this I thought, okay, I can send over all the others that I found and they would take care of it. They did not take care of it. When I used the proper channels to send them over, they went essentially into a black-box shredder, where they were ignored.
In the second half of 2019, I raised and flagged about three dozen more networks of inauthentic political activity, from Afghanistan to Albania, from Brazil to Bolivia, from India to Indonesia. And so it took a while to figure out the right way to actually get results and get people to actually respond. It depended on whether they were tied to politicians and prominent figures, because if something was tied to a political figure, it became much harder to take it down. I'm going to give you a number now that I did not believe when I first saw it. Although Azerbaijan certainly has far less than 3% of the world’s population—it's much smaller—its network was creating perhaps a million comments every month. And this constituted something like 3% of all comments by pages on posts by other pages worldwide, globally, civic or non-civic.
But even when I caught the Azerbaijani government red-handed, it took more than a year for the operation to be taken down. I found that the Azerbaijani government had set up a massive troll farm of paid operatives to harass the opposition in large volumes. This was very clearly bad. It was very clearly tied to the Azerbaijani government. And it was massive in scale. But when we took it down, I'm sure somewhere at Facebook there was a team that was very upset about why the numbers suddenly dropped for no reason they could figure out. Between my discovery and the takedown, Azerbaijan had cracked down on the opposition, arrested and imprisoned opposition figures, and started a war with Armenia.
It became more and more stressful to work on the issue. It's hard, individually, to see what impact any of this had, but it quickly became clear that it was tied to activity in nations that were struggling, because there was so much going on. And I was the one personally making decisions about what was important. It was essentially entirely up to me what I chose to investigate further, what I chose to prioritize and try to get attention for. And I chose not to prioritize Bolivia, because it was objectively very small and not very smart. Well, after the election, there were mass protests that escalated into what has been called alternatively a coup d'etat or a popular uprising, and that resulted in the fall of the Bolivian government. I know that this should not have been personally my responsibility, but at the end of the day, there was no one else who stepped up. And so I chose to do it myself. And because I had put myself in this position, it was essentially up to me what was important enough to focus on. And I want to be clear: there were always others who were in charge of reviewing my findings and in charge of actually taking things down at the end.
I decided from the start that I would only be the prosecutor, essentially. I would try my best to never be judge, jury, and executioner, because I already had too much power in my hands. I don't think anyone should be in the position of deciding if Albania is more important than Azerbaijan, or questions like that. Because I also found in Albania a network of accounts that were tied to supporting members of the Albanian government. But what I found in Azerbaijan was objectively worse in terms of size and scale. And I knew I only had the political capital to very slowly push through one at a time. And so I chose to focus on Azerbaijan. It's still going on in Albania. Albania had general elections earlier this year, and it was still going on at the time. I mean, more than two years after I discovered it, Facebook still hasn't done anything.
And I can only apologize profusely to the Albanian people. I should not have been in a position in which I needed to choose whether Albania or Azerbaijan was more important. I stand by my decision, because what I found in Azerbaijan was objectively worse, but still, no single person should be in charge of questions like this. There have been many news reports about how Facebook is under-resourced in the area of integrity. I haven't seen any news reports complaining that Facebook has too few resources in ads marketing. And I think that speaks volumes about the company's priorities at the end of the day.
What I found most difficult was that, in certain authoritarian countries, the democratic opposition was benefiting from employing unsavory tactics. And I agonized the most over those cases. But I still took them down without hesitation, because I believed very firmly that my allegiance was to the ideals of democracy and the rule of law, and that fundamentally democracy cannot stand on a bed of lies. As a very low-level employee, I was—in my own spare time, without any oversight whatsoever—personally making decisions that directly affected national governments.
It's a hard question to answer what I believe should happen, because part of it is like asking: if you could make the sky any color, what color would you like it to be? It's a mostly theoretical question, and your answer will have no actual effect on the real world.
I can't make Mark change his mind any more than I can make the sky pink overnight. This is up to the people listening to me, to people like yourself, because I can't change anything myself. I only have as much power as others grant me. If you want things to change, you should be personally asking your representatives, because ultimately this is not a problem experienced by a few individual people. This is a problem in which the costs are borne by society, by democracy, by civic discourse as a whole. And as a company, Facebook has no incentive to fix this, any more than we expect Philip Morris to develop non-addictive cigarettes.
Jennifer: I Was There When is an oral history project featuring the stories of people who have witnessed or created breakthroughs in artificial intelligence and computing.
Do you have a story to tell? Know someone who does? Drop us an email at podcasts at technology review dot com.
Jennifer: This episode was produced by me with help from Anthony Green and Emma Cillekens. We’re edited by Niall Firth and Mat Honan. Our mix engineer is Garret Lang… and our theme music is by Jacob Gorski.
Thanks for listening, I’m Jennifer Strong.