The opportunity for profit helps explain the rise of dozens of data exchanges, data marts, predictive analytic engines, and other intermediaries. It’s also why players such as Google, Facebook, and Zynga, among many others, are finding ways to aggregate ever more information about users. Facebook provides but one example of how extensive this kind of tracking can be. Its seemingly innocuous “Like” button has become ubiquitous online. Click on one of these buttons, and you can instantly share something that pleases you with your friends. But simply visit a page with a “Like” button on it while you’re logged in to Facebook, and Facebook can track what you do there. The former sounds great for consenting adults; the latter is more than a little unsettling. Facebook is hardly alone. A company called Lotame helps target online advertising by placing tags (sometimes known as beacons) on browsers to monitor what users are typing on any Web page they might view.
The potential dark side of Big Data suggests the need for a code of ethical principles. Here are some proposals for how to structure them.
Clarity on Practices: When data is being collected, let users know about it—in real time. Such disclosure would address the issue of hidden files and unauthorized tracking. Giving users access to what a company knows about them could go a long way toward building trust. Google has done this already. If you want to know what Google knows about you, go to www.google.com/ads/preferences, and you can see both the data it has collected and the inferences it, and third parties, have drawn from what you’ve done.
Privacy by Design: Some argue that neither clarity nor simplicity is sufficient. Ann Cavoukian, privacy commissioner for the province of Ontario, coined the phrase “privacy by design” to propose that organizations incorporate privacy protections into everything they do. This does not mean Web and mobile businesses collect no customer information. It simply means they make customer privacy a guiding principle, right from the start. Microsoft, which in 2006 issued a report called “Privacy Guidelines for Developing Software Products and Services,” has embraced this principle, using a renewed emphasis on privacy as a way to differentiate itself; the latest version of Internet Explorer, IE9, lets users activate features that can block third-party ads and content.
Exchange of Value: Walk into a local Starbucks, and you’re likely to feel flattered if a barista remembers your name and favorite beverage. Something similar applies on the Web: the more a service provider knows about you, the greater the chance that you’ll like the service. Radical transparency could make it easier for digital businesses to show customers what they will get in exchange for sharing their personal information. That’s what Netflix did in running a public competition offering third-party developers a $1 million award for creating the most effective movie recommendation engine. It was an open acknowledgement that Netflix was using users’ movie-viewing histories to provide increasingly targeted, and thus more useful, recommendations.
These principles are by no means exhaustive, but they begin to outline how companies might realize the value of Big Data while mitigating its risks. Adopting such principles would also get ahead of policymakers’ well-intentioned but often misguided efforts to regulate the digital economy. That said, perhaps the most important rule is one that goes without saying, something akin to the Golden Rule: “Do unto the data of others as you would have them do unto yours.” That kind of thinking might go a long way toward creating the kind of digital world we want and deserve.
Jeffrey F. Rayport specializes in analyzing the strategic implications of digital technologies for business and organizational design. He is a managing partner of MarketspaceNext, a strategic advisory firm; an operating partner at Castanea Partners; and a former faculty member at Harvard Business School. Carine Carmy contributed research to this article.