The Changing Nature of Privacy on Facebook

Microsoft’s Danah Boyd on social networking.

Earlier this month, Facebook sought to increase its reach by connecting with other sites across the Web. The Open Graph Protocol, announced at Facebook’s f8 Developers Conference, makes it easier for outside sites to share information with Facebook when visitors want to recommend a page. But Facebook has come under increasing scrutiny for making users’ data more public and available to search engines, and for changing the terms of its privacy policy in ways that many users have been unaware of.

Few have been as vocal about Facebook’s actions as Danah Boyd, a social media researcher at Microsoft Research New England. More generally, she has called for Web companies to take more responsibility for how they handle users’ personal information. Technology Review’s assistant editor, Erica Naone, recently talked with Boyd about how to think about Facebook’s latest moves.

Technology Review: Why is it so hard to keep up with the way Facebook works?

Danah Boyd: People started out with a sense that this is just for you and people in your college. Since then, it’s become just for you and all your friends. It slowly opened up, and in the process people lost a lot of awareness of what was happening with their data. This is one of the things that frightens me. I started asking all of these nontechnological people about their Facebook privacy settings, and I consistently found that their mental model of their privacy settings did not match what they actually saw in their data.

TR: What’s been driving these changes for Facebook?

DB: When you think about Facebook, the market has very specific incentives: encourage people to be public, increase ad revenue. All sorts of other things will happen from there. The technology makes it very easy to make people as visible and searchable as possible. Technology is very, very aligned with the market.

TR: Some people dismiss concerns about this sort of situation by saying that privacy is dead.

DB: Facebook is saying, “Ah, the social norms have changed. We don’t have to pay attention to people’s privacy concerns; that’s just old fuddy-duddies.” Part of that is strategic. Law follows social norms.

TR: What do you think is actually happening to the social norms?

DB: I think the social norms have not changed. I think they’re being battered by the way the market forces are operating at this point. I think the market is pushing people in a direction that has huge consequences, especially for those who are marginalized.

TR: A lot of people wonder why it matters if companies share personal data. How are people affected by privacy violations?

DB: The easiest one to explain is the case of teachers. They have a role to play during the school day, and there are times and places where they have lives that are not student-appropriate. Online, it becomes a different story. Facebook has now made it so that you can go and see everybody’s friends regardless of how private your profile is. And teachers are constantly struggling with the fact that, no matter how obsessively they’ve tried to make their profiles as private as possible, one of their friends can post a photo from when they were 16 and drinking or doing something else stupid, and all of a sudden, kids bring it into school. We want teachers to be able to have a teacher relationship with our kids that is different from the one they have with their intimates. Yet the technology puts the teacher constantly at risk.

TR: What can users do about this kind of thing?

DB: I think that the voices need to start speaking up. They have with Facebook historically, and I think that’s the really interesting thing. Users have taken issue when the rules changed and the company gave no warning.

TR: But does it matter if users speak up?

DB: It’s different in different cases. [Facebook’s failed advertising platform] Beacon didn’t have the outcome you might have expected. Users said, “Oh my God, what is this? This is horrible.” And a class-action suit ensued. That did not end with the service eventually being accepted.

TR: What sort of regulation would be helpful?

DB: If you’re going to change the privacy settings, the default should always be what the users originally chose, and you have to opt into changes. Period. End of story.

TR: What could Facebook do that would convince you they’d changed their ways?

DB: They need a set of actions that show they’re paying attention. If they actually care about making certain that people have a real understanding of their privacy, the best thing they could do is have every post they put up show all the people who can actually see it, or at least how many people can see it. If you see that something is visible to 10 million people, you might think twice about what the heck you did with your privacy settings.
