Facebook’s power over our social lives comes with great responsibility.
Dystopias in the 20th century came mostly in two flavors: the imposing surveillance state, as depicted in Orwell’s 1984, and the stupefying pleasure dome, as in Huxley’s Brave New World. What if these were not separate nightmares? The real threat could be surveillance in the service of seduction rather than punishment.
Facebook is so successful because it helps us fulfill the urge to remain connected to one another. That urge has led about a billion of us to provide a single company with imprints of a sizable part of our social lives (see “What Facebook Knows”).
As a social scientist, I’m terribly excited by this data trove, because it’s a great resource for studying the human animal. Emerging “big data” sources—Facebook is one of the best—have the potential to contribute to our understanding of society. But this information has uses that go beyond targeting ads. Though the prospect doesn’t seem to faze Facebook users so far, it could be used to target civic, political, and social messaging in ways that are unhealthy for democracy.
Political campaigns, for example, now encourage voters to connect with their Facebook apps or pages, which can access in-depth data about not just a person but his or her social networks and interactions. This creates opportunities for profiling with unprecedented precision and scale. I am waiting for the first wave of vicious negative political campaigning on Facebook. (Most of us might not even notice it, since it could be narrowly targeted to a receptive niche, or even to individuals.)
The way Facebook uses its collected data can influence our social interactions. Facebook’s news feed does not show all updates, or even the most recent, but rather what Facebook thinks will make you click or comment. News-feed algorithms create spirals of reinforcement for certain behaviors. The details are secret, but reverse-engineering suggests that Facebook thinks photos generate more engagement than text; updates with photos are featured more prominently, leading to even more engagement. Even without explicit instruction, people will undoubtedly pick up on these cues and start posting ever more photos. In other ways, too, Facebook may be guiding how we socialize. Perhaps cheery posts get more prominence. Perhaps we will one day learn of a suicide following a brief, cryptic status update that no one responded to because Facebook downgraded it in people’s news feeds, judging it not to be the kind of post that generates clicks.
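The reinforcement spiral described above can be illustrated with a toy model. This is not Facebook’s actual algorithm (which is secret); it is a minimal sketch assuming two post types, a fixed engagement weight favoring photos, and posters who gradually imitate whichever type the feed makes more visible:

```python
# Toy model of an engagement-ranking feedback loop.
# Assumptions (not drawn from Facebook's real system): photos score
# twice the engagement of text, and each round a fraction of posters
# switches toward the more visible post type.

ENGAGEMENT_SCORE = {"photo": 2.0, "text": 1.0}  # assumed weights

def run_feed(rounds=20, photo_share=0.5, imitation=0.1):
    """Simulate the share of photo posts over time. Each round, the
    feed boosts the type with higher expected engagement, and posters
    drift toward it -- the spiral of reinforcement."""
    history = [photo_share]
    for _ in range(rounds):
        # Expected visibility of each type under engagement ranking
        photo_vis = photo_share * ENGAGEMENT_SCORE["photo"]
        text_vis = (1 - photo_share) * ENGAGEMENT_SCORE["text"]
        # A fraction of posters imitates the more visible type
        if photo_vis > text_vis:
            photo_share = min(1.0, photo_share + imitation * (1 - photo_share))
        else:
            photo_share = max(0.0, photo_share - imitation * photo_share)
        history.append(photo_share)
    return history

shares = run_feed()
print(f"photo share: {shares[0]:.2f} -> {shares[-1]:.2f}")
```

Even starting from an even split, the photo share climbs round after round: a small ranking preference, fed back through user behavior, steadily reshapes what everyone posts.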
I do not claim there are easy answers to the questions Facebook raises. Any algorithm for the news feed would have its downside. But the questions are important because Facebook occupies an important civic niche. Decisions about how it uses its hoard of data, what it makes public, and how much access it gives political and corporate campaigns will affect us all. We need to be talking not just about the potential of this awesome data store but about the power the company has and the ethics it upholds.
Zeynep Tufekci is an assistant professor at the University of North Carolina and a fellow at the Berkman Center for Internet and Society at Harvard.