Microsoft Thinks DRM Can Solve the Privacy Problem
A leader at Microsoft proposes protecting personal data using technology once used to lock down music files.
Today it is all but impossible for consumers to know how their personal data is used by companies that collect and have access to it.
When sharing music online took off in the late 1990s, many companies turned to digital rights management (DRM) software to restrict what could be done with downloaded music files, only to give up after the approach proved ineffective and widely unpopular. Today Craig Mundie, senior advisor to the CEO at Microsoft, resurrected the idea, proposing that a form of DRM could be used to prevent personal data from being misused.
Speaking at MIT Technology Review’s EmTech conference in Cambridge, Massachusetts, Mundie said that a new approach is necessary because people currently have no way to be sure how data they share with companies will be used.
“There’s too much data being collected in so many ways, and a lot of it in ways that you don’t feel you had a role in the specific transaction,” he said. “Now that you’re just being observed, whether it’s for commercial purposes or other activities, we have to move to a new model.”
Mundie, who until late last year was in charge of Microsoft’s research wing as its chief research and strategy officer, thinks a system of DRM technology, accompanied by laws and regulations to enforce it, could provide the answer. “I think we’re going to have to have a usage-based way of controlling this now,” he said. “One way to do that is to put cryptographic wrappers around these things that control uses of this data.”
Under the model imagined by Mundie, applications and services that wanted to make use of sensitive data, such as a person's genome sequence or current location, would have to register with a central authority. That authority would distribute encryption keys to applications, allowing them to access protected data only in the ways approved by the data's owners.
The use of cryptographic wrappers would ensure that an application or service couldn’t use the data in any other way. But the system would need to be underpinned by new regulations, said Mundie: “You want to say that there are substantial legal penalties for anyone that defies the rules in the metadata. I would make it a felony to subvert those mechanisms.”
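Mundie offered no implementation details, but the idea of a policy-carrying cryptographic wrapper can be sketched. The toy Python below is purely illustrative and is not any Microsoft design: the `KeyAuthority` class, the `wrap`/`unwrap` names, and the use of an HMAC integrity tag (standing in for real encryption) are all assumptions made for the sake of the example.

```python
import hashlib
import hmac
import json
import secrets

class KeyAuthority:
    """Toy central authority: it holds the per-record keys and enforces
    the usage policy carried in each wrapper's metadata (illustrative only)."""

    def __init__(self):
        self._keys = {}  # record_id -> secret key held by the authority

    def wrap(self, record_id, payload, allowed_uses):
        """Seal data together with its usage policy; returns the wrapper."""
        key = secrets.token_bytes(32)
        self._keys[record_id] = key
        body = json.dumps({"data": payload, "uses": sorted(allowed_uses)}).encode()
        tag = hmac.new(key, body, hashlib.sha256).hexdigest()
        return {"id": record_id, "body": body, "tag": tag}

    def unwrap(self, wrapper, requested_use):
        """Release the data only if the wrapper is intact and the
        requested use appears in the owner-approved policy."""
        key = self._keys[wrapper["id"]]
        expected = hmac.new(key, wrapper["body"], hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, wrapper["tag"]):
            raise PermissionError("wrapper has been tampered with")
        record = json.loads(wrapper["body"])
        if requested_use not in record["uses"]:
            raise PermissionError(f"use '{requested_use}' not permitted")
        return record["data"]

authority = KeyAuthority()
wrapped = authority.wrap("user42-location",
                         {"lat": 42.36, "lon": -71.09},
                         allowed_uses={"navigation"})
print(authority.unwrap(wrapped, "navigation"))   # policy allows this use
try:
    authority.unwrap(wrapped, "ad-targeting")    # policy forbids this use
except PermissionError as err:
    print(err)
```

In a real system the payload would be encrypted rather than merely integrity-checked, and the legal penalties Mundie describes would back the technical check; the sketch only shows the shape of the policy gate.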
Mundie gave the example of a mobile application that requests permission to access a person’s geolocation, as determined by the sensors on a phone. “The current way, the app doesn’t have to say what it’s going to do with it,” he said. “If the app had to tell you what it was going to do with the data, then you could make a much more informed decision about whether you like that app or you don’t like that app.”
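The permission flow Mundie describes might take the form of a declared-purpose manifest that the user (or an authority acting for them) reviews before any key is issued. The field names and the `user_approves` helper below are invented for illustration; nothing here comes from a real platform's permission API.

```python
# Hypothetical declared-purpose manifest an app would submit on install,
# stating up front what it will do with the geolocation data it requests.
manifest = {
    "app": "example-maps",
    "data_requested": "geolocation",
    "declared_uses": ["turn-by-turn navigation"],
    "retention": "discard after session",
    "shared_with": [],  # no third parties
}

def user_approves(manifest, acceptable_uses):
    """Grant access only if every declared use is one the user accepts."""
    return all(use in acceptable_uses for use in manifest["declared_uses"])

print(user_approves(manifest, {"turn-by-turn navigation"}))  # True
print(user_approves(manifest, {"local search"}))             # False
```

The point of the declaration is exactly the one Mundie makes: with stated uses in hand, the decision to grant or deny becomes an informed one rather than a blanket yes.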
Mundie also said that medical data could be protected this way. He suggested that such measures will become necessary as personal genetic and genomic information becomes more crucial to medicine.
Discussions within Microsoft, with U.S. regulators, and with other large companies and governments via the World Economic Forum suggest that the approach would be accepted around the world, Mundie claimed. However, he didn’t share any details of specific efforts Microsoft or any other organization was making to develop or test the model.
Though he referred to the use of DRM to protect media files, Mundie avoided mentioning that the technology largely failed to prevent illegal sharing of downloaded music. That failure, however, came about in part because there were easy ways to get hold of unprotected versions of files, ripped from CDs and distributed via peer-to-peer systems such as Napster and BitTorrent.
The same is not likely to happen with a person's genome or location data, although Mundie's scheme would certainly give bad actors incentive to try to defeat the encryption.
Many existing businesses that rely on personal data would be likely to oppose Mundie’s proposal. He conceded that some kinds of data are so useful that they would probably end up exempt from any restrictions. “I believe that this is going to end up going both ways,” he said. “I predict that we will find there are certain classes of data that become so important to society, for health, education, or security reasons, that society will decide that people can’t opt out.”