
Apple Rolls Out Privacy-Sensitive Artificial Intelligence

Software that recognizes friends’ faces or knows what’s in your photos should handle that data on your device to protect privacy, says Apple.
June 13, 2016

On Monday Apple showed off a string of new iPhone features powered by recent advances in artificial intelligence—many of them aping ones already launched by rival Google.

But Apple’s announcements of features such as facial recognition and software that knows what’s in your photos, made during its annual Worldwide Developers Conference, stood out for how much they emphasized privacy.

Craig Federighi, senior vice president of software engineering at Apple, repeatedly stated that machine-learning algorithms able to understand personal data such as photos are being used only within the confines of a person’s iPhone, not on Apple’s cloud servers. “We believe you should have great features and great privacy,” he said.

Apple CEO Tim Cook at the company’s WWDC developer conference in San Francisco.

A new version of Apple’s Photos app, coming this fall with the next release of the company’s mobile operating system, will use facial recognition to maintain virtual albums of snaps featuring the people you photograph most often. It will also analyze the contents of your photos, so you can search your collection using keywords such as “horses” or “mountains.”

Federighi said those features are powered by deep learning, a technique that underpins significant recent progress in artificial intelligence. They also put Apple in the position of playing catch-up with Google, which introduced a photos service with the same features over a year ago (see “Google Rolls Out New Automated Helpers”).

But Federighi said Apple didn’t want its algorithms to send data about the contents of users’ photos back to the company. “When it comes to performing advanced deep learning and intelligence of your data, we’re doing it on your device, keeping your personal data under your control,” he said.

Companies including Google and Facebook generally run image-recognition algorithms inside their cloud computing systems, meaning that photos must be uploaded to a company’s servers. Although Google and others have privacy policies governing how data gleaned from customers is used, some experts say it is safer if data such as what’s inside your photos never reaches corporate servers in the first place.

Apple’s mobile operating system update this fall will also see the company’s QuickType mobile keyboard become better at suggesting words, thanks to an ability to understand the context of what you have already typed. For example, it would suggest completing the sentence “The Orioles are playing in the …” with the word “playoffs,” but the sentence “The children are playing in the …” with “yard.”
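As a rough illustration of the idea, the toy Swift sketch below (not Apple’s actual QuickType code; the contextSuggestions table and suggestCompletion function are invented for this example) picks a different completion for the same phrase depending on which subject appeared earlier in the sentence.

import Foundation

// Toy context table: the subject seen earlier in the sentence determines
// which completion gets suggested for “playing in the …”.
let contextSuggestions: [String: String] = [
    "Orioles": "playoffs",
    "children": "yard"
]

// Return the completion tied to whichever known subject the context contains.
func suggestCompletion(for sentence: String) -> String? {
    for (subject, suggestion) in contextSuggestions where sentence.contains(subject) {
        return suggestion
    }
    return nil
}

print(suggestCompletion(for: "The Orioles are playing in the") ?? "no suggestion")   // playoffs
print(suggestCompletion(for: "The children are playing in the") ?? "no suggestion")  // yard

Apple’s keyboard presumably relies on a learned language model rather than a lookup table, but the principle is the same: the suggestion is conditioned on the words already typed.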

Again, Google has rolled out this kind of technology more aggressively. Google’s Inbox e-mail app can suggest sentence-long replies to e-mails, and the same feature will be built into the company’s Allo messaging app launching this summer (see “Google Finally Launches Siri-Killer in Search Pivot”).

But Federighi said that Apple had invested in research that allowed the company to collect data on what users type while still protecting their privacy. The company has adopted a method called “differential privacy” that’s been widely discussed in academic research. The method makes it possible to collect data such as which words people use most commonly without leaking information about who is using which words, he said.
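Apple has not detailed its implementation here, but the flavor of differential privacy is often explained with randomized response, a textbook mechanism in which each device adds coin-flip noise to its report before sending it. The Swift sketch below is a generic illustration under that assumption; the function names are invented for this example.

// Each device randomizes its answer before reporting whether it used a given word,
// so any single report says almost nothing reliable about that user.
func randomizedResponse(didUseWord truth: Bool) -> Bool {
    if Bool.random() {
        return truth          // half the time, report the real answer
    } else {
        return Bool.random()  // otherwise, report a uniformly random bit
    }
}

// Across many reports the server can still estimate the true rate p, because
// the expected fraction of “true” reports is p/2 + 1/4.
func estimateTrueRate(from reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * (observed - 0.25)
}

// Example: 10,000 simulated users, 30 percent of whom actually typed the word.
let reports = (0 ..< 10_000).map { _ in randomizedResponse(didUseWord: Double.random(in: 0 ..< 1) < 0.3) }
print(estimateTrueRate(from: reports))  // prints a value close to 0.3

Real deployments tune how much noise is added, trading a little accuracy in the aggregate statistics for a formal limit on what can be learned about any individual.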

Federighi said that privacy researcher Aaron Roth, an associate professor at the University of Pennsylvania, was given a chance to review that work. “Incorporating differential privacy broadly into Apple’s technology positions Apple as the clear privacy leader among technology companies today,” said Roth, in a statement released by the company.

Apple’s claims for its new uses of machine learning are in line with its history of emphasizing that features involving user data, such as search, are implemented in ways that protect privacy. CEO Tim Cook has said the company can take this stance because Apple’s business model is not built on monetizing customer data.

 
