MIT Technology Review

Why Privacy Is Not Dead

The way privacy is encoded into software doesn’t match the way we handle it in real life.

Each time Facebook’s privacy settings change or a technology makes personal information available to new audiences, people cry foul. Each time, their cries seem to fall on deaf ears.

The reason for this disconnect is that in a computational world, privacy is often implemented through access control. Yet privacy is not simply about controlling access. It’s about understanding a social context, having a sense of how our information is passed around by others, and sharing accordingly. As social media mature, we must rethink how we encode privacy into our systems.
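To make the contrast concrete, here is a minimal sketch (hypothetical names, not modeled on any real platform’s API) of what rule-based, access-control privacy looks like in software: a post is visible if and only if the viewer appears on an allowed list. Everything this essay describes about context, audience, and social norms falls outside such a check.

```python
# A deliberately simplified sketch of "privacy as access control":
# visibility is a binary rule over a fixed audience list.
# (Hypothetical example; not based on any real platform's code.)

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    allowed_viewers: set[str] = field(default_factory=set)  # the "privacy setting"

def can_view(post: Post, viewer: str) -> bool:
    # The system's entire notion of privacy: is the viewer on the list?
    return viewer == post.author or viewer in post.allowed_viewers

post = Post(author="alice", text="venting about work",
            allowed_viewers={"bob", "carol"})

print(can_view(post, "bob"))   # True  - on the list
print(can_view(post, "dave"))  # False - off the list
# Nothing here captures context: whether Bob will repeat what he read,
# or whether the norms of the situation make sharing appropriate.
```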

Privacy is not in opposition to speaking in public. We speak privately in public all the time. Sitting in a restaurant, we have intimate conversations knowing that the waitress may overhear. We count on what Erving Goffman called “civil inattention”: people will politely ignore us, and even if they listen they won’t join in, because doing so violates social norms. Of course, if a close friend sits at the neighboring table, everything changes. Whether an environment is public or not is beside the point. It’s the situation that matters.

Whenever we speak in face-to-face settings, we modify our communication on the basis of cues like who’s present and how far our voices carry. We negotiate privacy explicitly (“Please don’t tell anyone”) or through tacit understanding. Sometimes this fails. A friend might gossip behind our back or fail to understand what we thought was implied. Such incidents make us question our interpretation of the situation or the trustworthiness of the friend.

All this also applies online, but with additional complications. Digital walls do more than have ears; they listen, record, and share our messages. Before we can communicate appropriately in a social environment like Facebook or Twitter, we must develop a sense of how and what people share.

When the privacy options available to us change, we are more likely to question the system than to alter our own behavior. But such changes strain our relationships and undermine our ability to navigate broad social norms. People who can be whoever they want, wherever they want, are a privileged minority.

As social media become more embedded in everyday society, the mismatch between the rule-based privacy that software offers and the subtler, intuitive ways that humans understand the concept will increasingly cause cultural collisions and social slips. But people will not abandon social media, nor will privacy disappear. They will simply work harder to carve out a space for privacy as they understand it and to maintain control, whether by using pseudonyms or speaking in code.

Instead of forcing users to do that, why not make our social software support the way we naturally handle privacy? There is much to be said for allowing the sunlight of diversity to shine. But too much sunlight scorches the earth. Let’s create a forest, not a desert.

Danah Boyd is a social-media researcher at Microsoft Research New England, a fellow at Harvard University’s Berkman Center for Internet and Society, and a member of the 2010 TR35.
