The job of the U.S. Federal Trade Commission is to protect consumers from deceptive and unfair business practices. And lately, “deceptive and unfair” has started to sound like a good description of the treatment consumers find on the Internet.
Social networks and advertisers collect huge amounts of information as people surf the Web. Yet few consumers understand, or even read, the complex privacy agreements they sign on to. What's more, some Internet firms flat-out ignore those agreements.
Recently, the 1,100-employee FTC has been trying to rein in corporate behavior with respect to people’s online personal information. During the last two years, it’s levied fines against data brokers like Spokeo and brought high-profile cases against both Facebook and Google for violating privacy promises the companies made to hundreds of millions of consumers.
The commission has been acting tough. But its authority is fundamentally limited. It works from laws—like the 1970 Fair Credit Reporting Act—that were passed before advertisers could track what we browse online and before smart phones could pinpoint our locations. The FTC thinks broader new consumer protections are needed.
The agency has been negotiating with Internet firms on a voluntary standard known as “Do Not Track.” The idea is to give consumers the ability to opt out of being followed across the Web as they browse (and for companies to make privacy promises the FTC can enforce). In March, the FTC’s five commissioners went further, calling on Congress to pass new “baseline” privacy and data security legislation that would lay out the responsibilities of companies that collect personal data online or off.
David Vladeck, a former law professor and longtime litigator with the consumer watchdog group Public Citizen, became director of the FTC’s Bureau of Consumer Protection in 2009. Last week, he spoke with Technology Review business editor Jessica Leber.
TR: What are the risks to consumers online?
Vladeck: There are several. We’ve seen a migration of traditional frauds to the Internet. In the last 18 months alone, we’ve shut down three Internet scams that bilked consumers out of nearly one billion dollars. Another is privacy. The FTC wants to ensure that consumers have control over their personal information and have easy, effective, and persistent ways to exercise that control. Third, we worry about malicious attacks—malware, spyware, spam—that threaten to impair the usefulness and safety of the Internet.
What's the significance of your settlements with Facebook and Google?

They've got to go back and rebuild their businesses in a way that takes privacy into account. I think these are important signals to industry. These are the biggest kids on the block.
You arrived at the FTC wanting to change its approach to privacy. What was the problem?
Our privacy framework developed principally before the advent of the Internet. That framework started to fray as the Internet became a principal means of communication. Companies would develop privacy policies, but they would be hard to find. They were written by lawyers like me, who use privacy policies not only to describe how data would be collected and used, but also to disclaim liability and to address every jot and tittle the company might want to address.
When I got here, I think there was a shared sense that the paradigm that had served in a paper world was not translating very well to a more digital world. We’ve been trying to change that paradigm to depend less heavily on incomprehensible privacy notices and to give consumers control over their data.
Is that why the FTC has called on Congress to pass new privacy laws?
We’ve requested congressional action in two spheres. One is in data security. Part of privacy is keeping data secure. We see, time and again, companies holding onto really sensitive information and not taking reasonable precautions to protect that data. We want Congress to give us the authority to impose civil penalties on companies that don’t respect their obligation to safeguard consumer information. Most recently, we’ve urged Congress to enact baseline privacy legislation. We can push basic privacy protections through public education, policy making, and enforcement. But baseline privacy legislation would give us a broader tool. It would also do a better job leveling the playing field so that companies that respect privacy are not disadvantaged in the marketplace.
In the meantime, your agency recommended that Internet companies voluntarily adopt a “Do Not Track” policy. What if they don’t?
We hope that industry, and the browser makers and the advertisers, all sit down and come up with a Do Not Track regime that meets the markers we’ve laid down. I think if they don’t, there’s a good chance that Congress will impose this on its own.
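At the wire level, the Do Not Track mechanism being negotiated is simple: a browser that honors it attaches a `DNT: 1` header to each HTTP request, and a cooperating site is expected to read that header and refrain from cross-site tracking. The sketch below illustrates the idea in Python; the URL and the `should_track` helper are hypothetical, not part of any standard.

```python
# Minimal sketch of the "Do Not Track" signal. A browser honoring the
# proposed standard sends "DNT: 1" with every request; a cooperating
# server checks the header before tracking. The URL is a placeholder
# and should_track() is a hypothetical server-side helper.
from urllib.request import Request

req = Request("https://example.com/article")
req.add_header("DNT", "1")  # 1 = user opts out of cross-site tracking


def should_track(headers: dict) -> bool:
    """Return True only when the user has not opted out of tracking."""
    return headers.get("DNT") != "1"
```

Nothing in the header itself stops tracking; the scheme works only if sites honor the flag, which is why the FTC frames enforcement around the promises companies make to respect it.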
How can we trust that companies will abide by Do Not Track if it’s voluntary?
We believe if a company makes that commitment and violates it, we’ll be able to detect that, and that will give rise to an enforcement case. We have the authority to basically hold people to their promises over privacy. We have technologists on staff who assure us that there are ways already available that we and others can detect tracking, no matter how sophisticated.
Is that why you’ve been staffing up on technology experts?
I don’t think we can be an effective enforcement agency unless we have the technological capability to detect and prove in court that there are violations. About two years ago, we started to build the first forensic mobile lab, for looking at mobile devices. Our concern is that we are seeing both privacy violations and fraud migrate to smart phones. And we wanted the ability not only to figure out that they were happening, but to capture evidence in real time that we could then use in a judicial proceeding. We proved, for example, that a children’s app company was collecting children’s geolocation data without getting parental consent.
But there are tons of challenges. There is off-the-shelf software you can use to do evidence captures on regular computers. For the moment, in mobile there is nothing that does the kind of broad evidence capture we need. So we've had to be ingenious in assembling our lab.
Does the forensic laboratory keep an eye on apps generally, or does it focus on investigations?
We’re doing investigations, which we’re not going to tell you about right here. One example that gives you a sense of the work: We did a report on children’s apps about three months ago. We tested probably 500 or 600 apps.
What mobile privacy questions do you think will be important in the near future?
As you know, overseas people now use their smart phones as their wallets. That technology is coming to the U.S., and coming very quickly. It’s obviously a great convenience for consumers, but there are serious privacy and security issues. We also held a huge conference in May on what we call “dot-com disclosures.” That is—how can you make effective [privacy] disclosures on your smart phone given the small real estate?
[Editor’s note: The FTC has clarified that it did not download or test the children’s apps. As described in its report, it studied the privacy disclosures and descriptions of those apps.]