
Apple Ignored Warning on Address-Book Access

The company knew in 2010 that an app was grabbing users’ personal information.
February 16, 2012

Apple was warned as long ago as 2010 that the popular Gowalla location-sharing iPhone app was uploading users’ address books without alerting them, Technology Review has learned.

This raises questions about why Apple didn’t do then what it announced it would do yesterday. In a statement, the company said it would issue software upgrades for iPhones to protect users from the practice, which its App Store rules forbid.

Apple’s statements follow a series of revelations over the past week concerning apps that access users’ address books. The revelations began when an independent developer discovered that the two-million-user-strong social network Path collects users’ address books, assembling vast collections of names, e-mails, and phone numbers without consent. Others found that some other popular apps, including the location-sharing services Foursquare and Gowalla, do the same. Transmitting and storing users’ address books exposes them to an increased risk of their personal data being leaked, perhaps through an attack like the one that extracted credit-card details from Sony last year.

The criticism that followed these discoveries—compounded by evidence that Apple ignored a warning about such behavior from academic researchers in 2010—has led to calls for the company to alter iOS and reform its famously opaque application approval process.

In the longer term, all smart-phone operating systems may need more effective privacy controls that better explain what personal data apps collect and let users opt out. Google’s Android mobile operating system already requires apps to receive explicit permission to access contact books or other private data, but app makers do not need to explain how that information will be stored or used, and many users seem not to fully understand what they are handing over.

In 2010, graduate student Manuel Egele and colleagues at the University of California, Santa Barbara, used a tool called PiOS to scan 1,400 iPhone applications for signs that they leaked sensitive user data. PiOS flagged Gowalla’s app because it stealthily uploaded a user’s entire address book to the company’s servers whenever the user viewed his or her list of phone contacts through the app.

That was a clear breach of user privacy, and of Apple’s own rules for inclusion in the App Store, says Egele, now a postdoctoral researcher at UCSB. But when Apple was contacted about it, a series of representatives showed little interest, he says. “We even took screenshots that showed it was being sent unencrypted,” he says. “They said, ‘If you have a privacy concern, you should contact the developer.’ ” Egele and colleagues presented a peer-reviewed paper on the work, including an account of their Gowalla finding, last year.

Apple did not reply to inquiries about the 2010 incident. But its first public statement on the address-book saga, made yesterday, implied that it had only just become aware of the issue.

Sooner or later, Apple may have to make more significant changes, says Ty Rollins, chief technology officer of Mobiquity, a large app development agency in Wellesley, Massachusetts. The existing design of iOS made what Path and others did “easy,” he says, and it doesn’t seem to safeguard personal data. Apple should add detailed privacy settings that provide fine-grained control over what different apps can do with the data on a phone, similar to those provided for Facebook apps, says Rollins. “That needs to happen to phones, too,” he says. “I don’t know why they’re taking this piecemeal approach now. Maybe they were trying to maintain this pristine interface.”

Apple has a reputation for tightly controlling what users can do with their mobile devices and for enforcing strict rules on which apps are permitted into the App Store. Yet in the case of Path and some other apps, it did not seem to apply those rules. That is problematic, because Apple has chosen to rely on those rules to protect users from an app’s behavior, rather than on technical features built into iOS. Technically, an iOS app could access other personal data, including photos, music playlists, recently viewed videos, and a device’s unique IMEI identifier, which can be used for ad tracking. No one has yet reported that any popular apps improperly use that data, however.
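
To make that gap concrete, here is a minimal sketch, not code from Gowalla, Path, or Apple, of how an app built against the C-based AddressBook framework of that era could read the entire contact database without triggering any consent dialog (the function name dump_contacts is purely illustrative):

```c
// Illustrative sketch only: an iOS app of the pre-2012 era could read the
// whole address book through the C-based AddressBook framework with no
// permission prompt. Not actual code from Gowalla, Path, or Apple.
#include <AddressBook/AddressBook.h>

void dump_contacts(void) {
    // ABAddressBookCreate() simply returned the full contact database;
    // no authorization step stood between the app and the data.
    ABAddressBookRef book = ABAddressBookCreate();
    if (book == NULL) return;

    CFArrayRef people = ABAddressBookCopyArrayOfAllPeople(book);
    CFIndex count = people ? CFArrayGetCount(people) : 0;

    for (CFIndex i = 0; i < count; i++) {
        ABRecordRef person = (ABRecordRef)CFArrayGetValueAtIndex(people, i);
        CFStringRef name = ABRecordCopyCompositeName(person);
        ABMultiValueRef emails = ABRecordCopyValue(person, kABPersonEmailProperty);

        // At this point an app could serialize names, e-mail addresses, and
        // phone numbers and upload them to its own servers, which is the
        // behavior PiOS flagged in Gowalla.

        if (name) CFRelease(name);
        if (emails) CFRelease(emails);
    }

    if (people) CFRelease(people);
    CFRelease(book);
}
```

Nothing in the operating system interposes a permission step here; keeping such behavior out of users’ hands depended entirely on Apple’s App Store review.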

Within the startup community, ready access to user data has been seen as a powerful tool, says Aza Raskin, cofounder of mobile health startup Massive Health. That perception may now change. “The more you know about someone, the better the feature set can be,” he says. “Privacy is sadly something that most people don’t think about [because] there isn’t enough consumer demand.” Path and others copied address books so they could inform their users when friends also joined and encourage more use of their social networks.

Apple’s aura of control may have convinced developers, security researchers, and users that personal data was being handled properly. “We’re seeing some of the disadvantages of a closed ecosystem,” says Raskin. “If that was a Web product, this would have been discovered long ago.”

Google takes the opposite approach with the Android Market. It doesn’t actively vet apps, but instead has built features into the operating system that make the data that apps can access transparent to a user. In practice, however, this may not provide much better protection than iOS does.

Although an Android user is asked to approve the data that an Android app can access, many people hurriedly tap “OK” rather than reading that list as they rush to try out their new app, says Adrienne Porter Felt of the University of California, Berkeley. She and colleagues are writing up the results of a study of how people handle Android app permissions. “Most people don’t pay attention to them. A small amount of people do, about 17 percent,” says Porter Felt. Studies on security warnings in browsers and on Microsoft Windows have shown that repeated exposure to such warnings dulls their impact.

There are 174 different types of permission that Android apps must request before using the corresponding features, says Porter Felt, compared with just two on iOS: one for apps that want location access and one for apps that want to send push notifications.

Raskin of Massive Health says that Apple now has an incentive to develop a fundamentally new approach to user privacy controls, one that neither bombards users with dialogs nor presents a complex panel of options. “This is something where they can really push the bar forward.”

Apple may be motivated by more than the bad press triggered by the Path case. Two members of the U.S. House of Representatives’ Energy and Commerce Committee wrote to Apple CEO Tim Cook yesterday to ask a series of questions about the access that apps can have to users’ contact data. The U.S. Federal Trade Commission has become increasingly interested in what tech companies do with user data in recent years, and it could conceivably decide that Apple has neglected its responsibility to protect users. Last autumn, both Google and Facebook agreed to 20 years of regular privacy audits by the FTC after the commission charged them, separately, with “deceptive” use of private data.
