Keeping Things Private at Microsoft

The company and its rivals have important differences when it comes to protecting personal information, says its chief privacy officer.
June 11, 2012

Earlier this year, Microsoft caused a stir by running big newspaper ads charging that its archrival, Google, was trampling on personal privacy by gathering ever more information on users. Some saw the ads as disingenuous: Microsoft uses some similar practices in its own search engine, Bing.

But inside Microsoft, the claim that it is better at privacy is an article of faith. Microsoft’s efforts began in the 1990s, when it battled security holes in its Windows operating system. Back then, privacy meant not having your computer infected with a hacker’s malware. Today, it means ensuring that companies don’t abuse your personal data. Journalist Lee Gomes spoke with Brendon Lynch, Microsoft’s chief privacy officer.

TR: Why did Microsoft criticize Google?

Lynch: It was recognizing that there is angst in the environment, that there is concern around privacy for a lot of consumers. We feel proud about the way we build privacy features and controls, and we wanted people to know that there was a choice out there.

Bing recently began letting users sign in with Facebook and share search results. Do you handle social media any differently?

People have to opt in to any experience where search results are shared. Also, Facebook requires users to be 13 or older to access its services, whereas Bing’s social search features will only surface results for users who are 18 or older.

What exactly is Microsoft’s philosophy on user privacy?

It’s what we call “privacy by design”; privacy should be built into our products and services from the ground up. A centerpiece of our program is the privacy review process, which enables engineers and product designers to assess the privacy implications of new products from the earliest stages of development. We have about 50 full-time privacy professionals at Microsoft, and roughly 2,000 privacy reviews are conducted each year.

What did Microsoft learn during the 1990s, when conspiracy theorists accused you of using Windows updates to spy on people?

That really helped us understand the importance of trust. Windows updates keep society protected from online threats. So we wanted to ensure that we had strong privacy controls in Windows Update, so that people would trust it and use it. One of the things we did was to have independent auditors crawl all over what we were doing and then issue a report that assured everyone we were collecting only the data we said we were collecting.

Does “privacy” mean something different to Microsoft than it did 15 years ago?

Security as it relates to data is primarily about the protection of that data, but privacy is something much broader: “what is the correct use of the data?” There was a lot more focus on security in the early days at Microsoft, but we’ve been investing deeply over the last 10 years to get us ready for this moment when privacy would become much more important. The big privacy challenge of our time will be enabling society to benefit from information-centric innovations while ensuring that personal privacy is protected.

A lot of people criticize how Web privacy is handled in the United States, because it forces users to keep up with an endless stream of privacy notices that they never really read.

It is clear that the current framework of notice, choice, and consent is under some strain. It puts a lot of the burden on the individual to understand what is happening and then make informed choices. The reality is that some people really do want to read a privacy statement, have controls, and make choices. But our research also tells us that the vast majority of people just want to feel protected and to be able to trust what they do online.

Do devices like the Kinect game controller present new privacy challenges?

There are some privacy sensitivities—it can do voice recognition, it can do facial recognition. Protecting privacy in this case involved making sure that none of this information leaves the Kinect device. It’s not storing the information or sharing it with anything else.
