How App Developers Leave the Door Open to NSA Surveillance
News that the National Security Agency has for years harvested personal data “leaked” from mobile apps such as Angry Birds triggered a fresh wave of chatter about the extent of the NSA’s reach yesterday. However, the NSA and its U.K. equivalent, GCHQ, hardly had to break much technical ground to hoover up that data. Few mobile apps implement encryption technology to protect the data they send over the Internet, so the agencies could trivially collect and decode that data using their existing access to Internet networks.
Documents seen and published by the New York Times and Guardian newspapers show that the NSA and GCHQ can harvest information such as a person’s age, location, and sexual orientation from the data sent over the Internet by apps. Such personal details are contained in the data that apps send back to the companies that maintain and support them. This includes data sent to companies that serve and target ads in mobile apps.
“This is evidence of negligent levels of insecurity by app companies,” says Peter Eckersley, technology projects director for the Electronic Frontier Foundation. Eckersley says his efforts to persuade companies to secure Web traffic have revealed widespread disregard for the risks of sending people’s data over the Internet without protections against interception. “Most companies have no legitimate reason” not to secure that data, says Eckersley. “Often the security and privacy of their users is so far down the priority list that they haven’t even thought about doing it.”
A 2012 study of 13,500 Android apps by researchers in Germany found that only 0.8 percent used encrypted connections exclusively, and that 43 percent used no encryption at all. Last week mobile app security company MetaIntell reported that 92 percent of the 500 most popular Android applications communicated some data insecurely.
It is often difficult to tell whether or not an app encrypts the data it transmits. Web browsers show a padlock icon next to a site’s Web address if it is using encryption, but mobile apps have no equivalent indicator. Manually checking whether a mobile app secures its data transfers involves inspecting network logs to examine how the app connects to its servers.
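One rough way to do such an inspection, sketched below in Python: a captured TCP payload from a plaintext HTTP connection begins with a readable request method, while TLS-encrypted traffic begins with a binary record header (content type `0x16` for the handshake, followed by `0x03` version bytes). The helper names here are illustrative, not from any particular tool.

```python
# Illustrative sketch: classifying a captured TCP payload as plaintext
# HTTP or TLS. Payloads would come from a packet-capture tool; these
# helpers only look at the first bytes of the application data.

HTTP_METHODS = {b"GET", b"POST", b"PUT", b"HEAD", b"DELETE", b"OPTIONS"}

def looks_like_plaintext_http(payload: bytes) -> bool:
    # A plaintext HTTP request starts with a readable method name,
    # so anyone on the network path can read the full request.
    return payload.split(b" ", 1)[0] in HTTP_METHODS

def looks_like_tls(payload: bytes) -> bool:
    # A TLS record starts with content type 0x16 (handshake)
    # followed by a 0x03 major-version byte.
    return len(payload) >= 3 and payload[0] == 0x16 and payload[1] == 0x03
```

A location query sent in the clear, such as `GET /maps?q=home+address HTTP/1.1`, would be flagged as plaintext and is exactly the kind of data the leaked documents describe intercepting.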
The documents published on Monday single out Google Maps as leaking particularly useful data for surveillance purposes. Documents from both the NSA and GCHQ note how search queries intercepted from this app can reveal a person’s movements. A 2008 document from GCHQ states that a system set up to intercept that data “effectively means that anyone using Google Maps on a smartphone is working in support of a G.C.H.Q. system.”
Google made encryption the default for its Web search last September but does not publicize which of its mobile apps use encryption. A company spokesperson told MIT Technology Review that current versions of the Google Maps app use encryption to protect data sent back to the company’s servers. That suggests intelligence agencies can no longer see the places people are searching for by intercepting Internet traffic.
The leaked documents also highlight how ad targeting technology built into many apps can leak personal information. Many app companies make use of technology from third party ad companies that collect and transmit ad-tracking and ad-targeting data (see “Mobile-Ad Firms Seek New Ways to Track You” and “Get Ready for Ads That Follow You from One Device to the Next”).
That data often contains “profile” data about a person, such as gender, approximate age, and location. A 2012 GCHQ report details technology designed to pluck such profiles from the data transmitted by the game Angry Birds. MetaIntell’s analysis of the current Android version of that app found that it sends unencrypted data to AdMob, the mobile ad company owned by Google. The 2012 report also singles out ad company Millennial, which compiles profiles that can also include a person’s ethnicity, marital status, and sexual orientation. A spokesperson for Millennial told MIT Technology Review that the company only gets to see data that its partners have permission to collect from their users and that ads are not targeted based on sexual orientation.
Ad technology companies such as Millennial have fewer incentives than app makers to devote time to protecting the data they transmit, because they work behind the scenes. Ad companies appear particularly unlikely to use encryption, says independent researcher Ashkan Soltani, who contributed to a 2010 survey of mobile app privacy and security by the Wall Street Journal that found the majority of data sent by apps was unprotected by encryption. “Most of the ad platforms don’t support it,” he says.
Marc Rogers, principal security researcher at mobile security company Lookout, says Monday’s news could help to change things. “As an industry, mobile app developers and especially mobile advertisers will need to shift their understanding of what should be considered sensitive to include any personal information sent over the wire,” he says.
If that doesn’t happen, it won’t be due to the technical and financial burdens. “With the state of technology today, no mobile app should be given a pass on deploying proper encryption,” says Rogers.
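As a sense of how small that burden is, here is a minimal sketch of the client-side fix Rogers is alluding to: wrapping an app’s connection in TLS with certificate and hostname verification, which Python’s standard library enables by default. The endpoint name is hypothetical; the point is that secure defaults are one call away.

```python
import ssl

# A default SSL context already requires a valid certificate and
# checks that it matches the server's hostname.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# An app would then wrap its socket before sending any user data,
# e.g. (hostname hypothetical):
#   secure_sock = context.wrap_socket(sock, server_hostname="api.example-app.com")
# Everything written to secure_sock is encrypted in transit.
```

With the connection wrapped this way, the profile data described above would no longer be readable by anyone passively tapping the network.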