Keeping Things Private at Microsoft

The company and its rivals have important differences when it comes to protecting personal information, says its chief privacy officer.
June 11, 2012

Earlier this year, Microsoft caused a stir by running big newspaper ads charging that its archrival, Google, was trampling on personal privacy by gathering ever more information on users. Some saw the ads as disingenuous: Microsoft uses some similar practices in its own search engine, Bing.

But inside Microsoft, the claim that it is better at privacy is an article of faith. Microsoft’s efforts began in the 1990s, when it battled security holes in its Windows operating system. Back then, privacy meant not having your computer infected with a hacker’s malware. Today, it means making sure companies don’t abuse your personal data. Journalist Lee Gomes spoke with Brendon Lynch, Microsoft’s chief privacy officer.

TR: Why did Microsoft criticize Google?

Lynch: It was a recognition that there is angst out there, that a lot of consumers are concerned about privacy. We feel proud about the way we build privacy features and controls, and we wanted people to know that there was a choice out there.

Bing recently began letting users sign in with Facebook and share search results. Do you handle social media any differently?

People have to opt in to any experience where search results are shared. Also, Facebook requires users to be 13 or older to access its services, whereas Bing’s social search features will only surface results for users who are 18 or older.

What exactly is Microsoft’s philosophy on user privacy?

It’s what we call “privacy by design”; privacy should be built into our products and services from the ground up. A centerpiece of our program is the privacy review process, which enables engineers and product designers to assess the privacy implications of new products from the earliest stages of development. We have about 50 full-time privacy professionals at Microsoft, and roughly 2,000 privacy reviews are conducted each year.

What did Microsoft learn during the 1990s, when conspiracy theorists accused you of using Windows updates to spy on people?

That really helped us understand the importance of trust. Windows updates keep society protected from online threats. We wanted to ensure that we had strong privacy controls in Windows Update so that people would trust it and use it. One of the things we did was to have independent auditors crawl all over what we were doing and then issue a report that assured everyone we were collecting only the data we said we were collecting.

Does “privacy” mean something different to Microsoft than it did 15 years ago?

Security as it relates to data is primarily about the protection of that data, but privacy is something much broader: what is the correct use of the data? There was a lot more focus on security in the early days at Microsoft, but over the last 10 years we have been investing deeply to prepare for the moment when privacy would become much more important. The big privacy challenge of our time will be enabling society to benefit from information-centric innovations while ensuring that personal privacy is protected.

A lot of people criticize how Web privacy is handled in the United States, arguing that it forces users to keep up with an endless list of privacy notices that they never really read.

It is clear that the current framework of notice, choice, and consent is under some strain. It puts a lot of the burden on the individual to understand what is happening and then make informed choices. The reality is that some people really do want to read a privacy statement, have controls, and make choices. But our research also tells us that the vast majority of people simply want to feel protected and to be able to trust what they do online.

Do devices like the Kinect game controller present new privacy challenges?

There are some privacy sensitivities—it can do voice recognition, it can do facial recognition. Protecting privacy in this case involved making sure that none of this information leaves the Kinect device. It’s not storing the information or sharing it with anything else.

