Apple and Google are scrambling to regain trust after revelations about the way smartphones and tablets handle users’ location data. In a U.S. Senate subcommittee hearing held today, representatives from Apple and Google stressed that their companies had streamlined and clarified their handling of location-based data. But a key unanswered question is how they’ll let third-party app providers share that information.
The problem is that users enjoy location-based services, but most don’t understand what happens to the data they share in exchange for using those services. Senators wondered if location data was being stored securely enough to protect users. They pointed to the lack of privacy policies for many mobile apps, and noted that even when users are aware of what happens to their data, they may find it difficult to control.
For example, until Apple’s update to iOS last week, a user who opted out of location services wasn’t actually turning off all of the device’s location-based sharing. Apple said the problem was due to a bug that has since been corrected.
Guy “Bud” Tribble, vice president of software technology for Apple, testified that “Apple does not track customers’ locations. Apple has never done so and has no plans to do so.”
Tribble said that the location information found on phones represented a portion of a crowd-sourced database that Apple maintains in order to process location information more rapidly than is possible through GPS alone. The company stores the locations of cell towers and Wi-Fi hot spots collected from millions of devices. User devices note which towers and hot spots they can connect to, and use that to quickly deduce location. He said that the information stored on iPhones was never a user’s location.
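The lookup Tribble described can be pictured in a few lines of code. The sketch below is not Apple’s actual system; the database, the MAC addresses, and the simple centroid averaging are all illustrative assumptions. It only shows the general idea: a device checks which hot spots it can see against a crowd-sourced table of known hot spot positions and averages them to get a fast position estimate, rather than waiting for a GPS fix.

```python
# Hypothetical crowd-sourced database mapping Wi-Fi hot spot MAC
# addresses to known (latitude, longitude) positions. In the scheme
# Tribble described, entries like these are aggregated from millions
# of devices.
HOTSPOT_DB = {
    "aa:bb:cc:00:00:01": (37.7750, -122.4194),
    "aa:bb:cc:00:00:02": (37.7752, -122.4190),
    "aa:bb:cc:00:00:03": (37.7748, -122.4198),
}

def estimate_location(visible_hotspots):
    """Estimate the device's position as the centroid of the known
    positions of the hot spots it can currently see. Returns None if
    no visible hot spot appears in the database."""
    known = [HOTSPOT_DB[mac] for mac in visible_hotspots if mac in HOTSPOT_DB]
    if not known:
        return None
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return (lat, lon)

# A device seeing two nearby hot spots gets an estimate between them.
print(estimate_location(["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"]))
```

Even this toy version makes Soltani’s objection concrete: although the database stores hot spot positions rather than user positions, in a dense urban area the centroid of a few visible hot spots lands within tens of feet of the device.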
But Ashkan Soltani, an independent researcher and consultant who also testified at the hearing, suggested that Tribble’s explanation was disingenuous. For users who live in urban areas, Soltani said, the data on phones pinpointed hot spots as close as 20 feet away—which, he argued, is effectively the user’s location. “We need a clear definition of what ‘location’ means,” he said, and also called for more clarity about what constituted an “opt-in” policy.
The picture gets even muddier when third-party apps are considered. “Users don’t have a very good idea about what a lot of the applications on their phones are doing,” says Stuart Anderson, cofounder of Whisper Systems, a company that makes security and privacy software for Android phones. “Applications ask for very broad permissions.” Anderson notes that these permissions include the functions users expect, but might also cover unexpected actions from third-party code, such as monitoring used by advertisers.
To make matters worse, users don’t have fine-grained control—Anderson notes that they don’t have the ability to adjust a phone’s behavior or give selective permissions (his company recently released a product designed to do just that).
Apple, which observes a strict review policy for entrants to the App Store, seems to hope that its design standards will keep third-party apps on their best behavior. “What we need to do is put things in the user interface that make it clear what the app is planning to do,” Tribble said. He pointed out that Apple currently has an icon that shows whether an app has used location data within the past 24 hours. He said that Apple doesn’t believe that providing a technical means of limiting the information that apps can access would work. Instead, he said, the combination of the review process and design decisions would provide sufficient quality control.
Alan Davidson, Google’s director of public policy for the Americas, pointed to Google’s contrasting vision of an open app market. Instead of Google reviewing apps, Android devices are designed to recognize what an app is asking for, and alert the user, he said. Google has worked hard to make the resulting notifications understandable and useful, he said.
Both companies argued that they weren’t responsible for what apps do on their platforms. Such a stance, however, is unlikely to appease regulators or customers, which means that Apple and Google may soon have to take more responsibility for how third parties behave on their platforms.