President Donald Trump’s threat last week to overturn Section 230—the free speech law that shields social-media platforms from liability for what their users post—may have been empty and unworkable. But the uproar about it, sparked by Twitter’s decision to label two of the president’s tweets as misinformation, immediately tapped into two of the left’s and the right’s favorite stereotypes: a stubborn, vindictive president who can’t keep his hands off the keyboard, and a mass of wild liberals in Silicon Valley who consistently oppose his every move.
The narratives approved by these companies suggest that such decisions are carefully made, hand-wringing affairs that try to walk a policy tightrope. The decision whether to, say, fact-check the president’s tweets on mail-in voting happens only after a great deal of consternation and reflection. Yesterday, in fact, Facebook CEO Mark Zuckerberg told employees that his stance on Trump was a “tough decision” that was “pretty thorough.”
It’s easy to take this statement at face value. These companies are complicated, sprawling operations, after all—so surely these decisions are complicated too? The businesses have billions of users and many thousands of employees, which often gives them the feel of a bustling, if often infuriating, democracy, or at the very least a functioning bureaucracy.
In reality, though, things are far less complicated, and far more personal, than they appear. Social-media firms might seem new, with young CEOs, but behind the scenes they are in fact very similar to the old-fashioned movie studios and media companies whose businesses they stepped over on the way to success: vast fiefdoms, tightly controlled by their leaders.
Stay in control
Take Facebook. Zuckerberg has a remarkable amount of control over his business. Today he owns around 14% of the company, but nearly 60% of its voting shares. This makes him almost invulnerable. Last year, at Facebook’s annual shareholder meeting, 68% of independent investors voted to remove him as chairman and replace him with an outsider; Zuckerberg instead voted himself back into office.
From the very beginning, most things at the company have been done the way he wants them. To emphasize his influence, every single Facebook page until 2007 carried the text that it was “A Mark Zuckerberg Production.” A decade later, when he was accused of having a negligent attitude towards political manipulation, he responded by pushing aside many of his lieutenants and saying he was done with being a “peacetime leader.”
This negligence came about partly because while he obsesses over many aspects of the product, he prefers to be hands-off in the areas that aren’t exciting to him—instead demanding systems that can be perfected to run without his oversight.
That’s why, to counter Zuckerberg’s lack of interest in content moderation, the company painstakingly developed a rulebook over many years that is intended to produce the appearance of logic. But while the company aims to operate a system of laws (it has even introduced an “oversight board,” its own Supreme Court in the making), it has in fact created a universe of inconsistently applied rules. The result is a place where nipples can be banned while calls for genocide are supercharged; where US politicians who lie in campaign ads are treated as if they were a protected class while foreign leaders are booted off the platform unceremoniously.
This is in part because, as Mark Zuckerberg’s controversial, secretive call to the Oval Office on May 31 made clear, the rules apply only as long as the CEO wants them to.
As Kara Swisher, a longtime observer of Silicon Valley, put it on CNBC: “It’s not Mark and his minions, it’s just Mark. He has a passing knowledge of the First Amendment … but he’s made his decision, and his decision is the rule of law at Facebook, so that’s what they’re doing.”
At Twitter, too, CEO Jack Dorsey’s influence reigns—although in a very different way.
The company’s approach to content moderation has also been wildly inconsistent over the years, but in a way that reflects not an unattainable desire for rule of law, but Dorsey’s inability to know what he wants.
Twitter has never veered away from politics in the same way as Facebook. It once delayed scheduled downtime at the request of the Obama White House to help foment a potential revolution in Iran. And Dorsey famously courted Black Lives Matter leaders in the wake of the Ferguson protests in 2014, making public appearances in a “#staywoke” T-shirt. Yet he also dances with the right: telling conservative podcaster Joe Rogan that the site has been too harsh on right-wing users, dissembling over violent threats and abuse on the platform, and explaining why Twitter wasn’t banning conspiracy theorist Alex Jones shortly before it, well, banned him.
But unlike Facebook, which has never achieved consistency because it has found the world an illogical and confusing place, Twitter never really sought consistency in the first place—just attention and growth. While the company may no longer stand by its infamous claim that it was “the free speech wing of the free speech party,” it is clear that being seen, sparking reactions, and letting people’s ideas run free was integral to the company’s success.
“I don’t think it’s inaccurate to say we were optimizing for freedom of expression,” cofounder Evan Williams once told me. “A lot of things that people think Twitter could easily do to curb bad actors—the reason they don’t is because to some extent the company still sees that as a big part of its role.”
That’s why Twitter can decry abuse while simultaneously building tools that amplify misinformation. Or why Dorsey has been clear in the past that Trump is able to flout the rules because he makes news (it’s a policy ouroboros the company calls its “public interest exception,” but it essentially means that the more important you are, the less accountable you become).
Dorsey did take some form of responsibility for Twitter’s spat with the White House.
But he is not the one facing the heat for Twitter’s sudden display of something resembling a spine. The people bearing the brunt of his indecision are his employees, who now face death threats for doing their jobs.
All businesses have founders, leaders, and decision makers. They are entitled to make their own decisions, as long as they’re legal. But the law also requires that public companies do not act as personal kingdoms: they are beholden to their shareholders. For now, those investors are more interested in money than accountability—but because Silicon Valley likes to hand equity to its workers, many shareholders are the same employees who are affected by these arbitrary decisions. And they are starting to push back with public protests and internal dissent.
But accountability is hard to achieve in a monarchy. Though Zuckerberg and Dorsey say the buck stops with them, in fact they remain insulated from their bad decisions. Zuckerberg could change, but his level of control means nobody can force him to. Dorsey’s leadership is so unchallenged that—even though Twitter’s stock price has rarely matched the levels it held when he took the reins in 2015—he can remain a part-time CEO without fear of being deposed. How are these rulers proposing to change? Who are they listening to? What counterarguments do they hear? And why should they do anything other than trust the instincts that have made them billionaires?
All around, Americans are seeing the impact of quixotic leaders who make arbitrary, defensive choices based on instinct and self-preservation: on Facebook, on Twitter, and on the streets.
That’s why, despite the stereotypes and the blustering conflicts, Zuckerberg and Dorsey are perhaps more similar to Trump than they are different. These are businessmen who are used to getting their own way. They got very rich very early, live in gilded isolation, and have built environments around them where their word is really the only thing that matters. They make “tough decisions” that seem inconsistent or confused because, in the end, the only real consistency is that they are the ones issuing the proclamation.