A few weeks ago, a friend wrote to me with a problem. He said his daughter's name (let's call her Alice Haynes) was mistakenly appearing on the Internet as a member of a bowling group on the social-networking site Meetup. Because I'm on Meetup's board, he asked me to get her name removed. When I checked, the Alice Haynes in question turned out not to be his daughter but some other Alice Haynes in another city.
The episode was a small example of how issues of online identity and privacy are changing. In the old days, the issue was keeping your data secret. Now, the challenge is making sure your data isn’t mixed up with someone else’s, and controlling it as it spreads out over the Web. This means managing and curating it.
Your presence on the Web is increasingly distributed. And your data is not yours alone; it also belongs to the merchant who sold you that red sweater (size 12), to Juan who took the photo of you on the beach, and to Susan who said things about you. Should I have the right to control what another person says about me? If I am a Yankees fan, and you have given some vendor permission to track you and advertise Red Sox gear to you, should I have no control over the fact that you may see Red Sox ads when you visit my Facebook page? If some other person with my name does something embarrassing, how can I keep my identity separate? (For example, do you want everyone to have some kind of unique ID, or does that idea terrify you?)
All these questions reflect a new dimension of privacy: users’ ability to control their self-presentation. The difficulty of doing this intensifies as advertisers and website owners try to make money from user-generated content.
Joint rights (in this case, the individual's and the platform owner's rights to information or to presentation) invariably lead to tensions, trade-offs, and conflict. General principles of how to accommodate both owners are useful, but individuals have differing interests and sensitivities. Satisfying them requires contracts, ideally in the form of easily checked-off permissions and restrictions.
Over time, vendors and users together will develop tools and practices to deal with these questions. But current website "privacy" policies don't suffice. They're full of abstractions, euphemisms, and generalities, such as, "We may, at any point in time, provide certain Specified Information to selected Marketing Partners … ." Why not show the user the same specific information that's being sold to those "marketing partners": user name, address, credit history, purchasing behavior, and so on? And then list, say, the top 10 marketing partners, and offer the full searchable list on request? Or allow the user to decide which advertisers may "sponsor" her presence on that site? All these options would allow users to make informed choices.
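As a purely hypothetical sketch of what such checked-off permissions could look like under the hood, a site might record each user's choices as explicit per-field, per-partner grants rather than a single blanket consent. Every field and partner name below is invented for illustration; nothing here describes any actual site's implementation.

```python
# Hypothetical sketch: per-field, per-partner sharing permissions.
# All field names and partner names are invented for illustration.

DEFAULT_DENY = False  # share nothing unless the user has opted in


def may_share(permissions, field, partner):
    """Return True only if the user explicitly checked the box
    allowing this field to go to this marketing partner."""
    return permissions.get(field, {}).get(partner, DEFAULT_DENY)


# One user's checked-off choices:
alice_permissions = {
    "user_name":        {"AcmeAds": True,  "DataBrokerCo": False},
    "purchase_history": {"AcmeAds": False, "DataBrokerCo": False},
}

may_share(alice_permissions, "user_name", "AcmeAds")       # opted in
may_share(alice_permissions, "credit_history", "AcmeAds")  # never granted
```

The design choice worth noting is the default: anything the user never checked off is denied, so new data fields or new partners start out unshared until the user affirmatively opts in.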
Esther Dyson is an investor in and board member of 23andMe, Boxbe, Meetup, WPP Group, and Yandex, among other companies.