
Business Report

The FTC's Privacy Cop Cracks Down

Washington’s consumer protection agency is making sure that Internet “privacy” lives up to its name.

The job of the U.S. Federal Trade Commission is to protect consumers from deceptive and unfair business practices. And lately, “deceptive and unfair” has started to sound like a good description of the treatment consumers find on the Internet.

Consumer advocate: The FTC’s David Vladeck has brought cases against Google and Facebook for privacy violations.

Social networks and advertisers collect huge amounts of information as people surf the Web. Yet few consumers understand—or even read—the complex privacy agreements they sign on to. What’s more, some Internet firms flat-out ignore those agreements.

This story is part of our July/August 2012 Issue

Recently, the 1,100-employee FTC has been trying to rein in corporate behavior with respect to people’s online personal information. During the last two years, it’s levied fines against data brokers like Spokeo and brought high-profile cases against both Facebook and Google for violating privacy promises the companies made to hundreds of millions of consumers.

The commission has been acting tough. But its authority is fundamentally limited. It works from laws—like the 1970 Fair Credit Reporting Act—that were passed before advertisers could track what we browse online and before smart phones could pinpoint our locations. The FTC thinks broader new consumer protections are needed.

The agency has been negotiating with Internet firms on a voluntary standard known as “Do Not Track.” The idea is to give consumers the ability to opt out of being followed across the Web as they browse (and for companies to make privacy promises the FTC can enforce). In March, the FTC’s five commissioners went further, calling on Congress to pass new “baseline” privacy and data security legislation that would lay out the responsibilities of companies that collect personal data online or off.

David Vladeck, a former law professor and longtime litigator with the consumer watchdog group Public Citizen, became director of the FTC’s Bureau of Consumer Protection in 2009. Last week, he spoke with Technology Review business editor Jessica Leber.

TR: What are the risks to consumers online?

Vladeck: There are several. One is fraud: we’ve seen a migration of traditional frauds to the Internet. In the last 18 months alone, we’ve shut down three Internet scams that bilked consumers out of nearly one billion dollars. Another is privacy. The FTC wants to ensure that consumers have control over their personal information and have easy, effective, and persistent ways to exercise that control. Third, we worry about malicious attacks—malware, spyware, spam—that threaten to impair the usefulness and safety of the Internet.

You’ve brought major cases against Facebook and Google, and you’re requiring them to undergo audits every two years until 2032. What is the significance of those cases?

These are examples of enforcement actions we brought to ensure that companies adhere to the promises they make to consumers about privacy. I think the commitment that Google and Facebook have made is really an important one. Auditors are going to come in and make sure they are actually meeting the commitments laid out in their privacy policy. The audits are designed to make sure that companies bake privacy in at every step of offering a product or service. This is going to require the expenditure of a lot of money and a lot of time for companies that did not start out doing things this way. And I think it’s fair to say that neither Facebook nor Google did.

They’ve got to go back and rebuild their business in a way that takes privacy into account. I think these are important signals to industry. These are the biggest kids on the block.

You arrived at the FTC wanting to change its approach to privacy. What was the problem?

Our privacy framework developed principally prior to the advent of the Internet. That framework started to fray as the Internet became a principal means of communication. Companies would develop privacy policies, but they would be hard to find online. They were written by lawyers like me, who use privacy policies not only to describe how data would be collected and used but also to disclaim liability and to address every jot and tittle the company might want to address.

When I got here, I think there was a shared sense that the paradigm that had served in a paper world was not translating very well to a more digital world. We’ve been trying to change that paradigm to depend less heavily on incomprehensible privacy notices and to give consumers control over their data.

Is that why the FTC has called on Congress to pass new privacy laws?

We’ve requested congressional action in two spheres. One is in data security. Part of privacy is keeping data secure. We see, time and again, companies holding onto really sensitive information and not taking reasonable precautions to protect that data. We want Congress to give us the authority to impose civil penalties on companies that don’t respect their obligation to safeguard consumer information. Most recently, we’ve urged Congress to enact baseline privacy legislation. We can push basic privacy protections through public education, policy making, and enforcement. But baseline privacy legislation would give us a broader tool. It would also do a better job leveling the playing field so that companies that respect privacy are not disadvantaged in the marketplace.

In the meantime, your agency recommended that Internet companies voluntarily adopt a “Do Not Track” policy. What if they don’t?

We hope that industry, and the browser makers and the advertisers, all sit down and come up with a Do Not Track regime that meets the markers we’ve laid down. I think if they don’t, there’s a good chance that Congress will impose this on its own.

How can we trust that companies will abide by Do Not Track if it’s voluntary?

We believe if a company makes that commitment and violates it, we’ll be able to detect that, and that will give rise to an enforcement case. We have the authority to basically hold people to their promises over privacy. We have technologists on staff who assure us that there are ways already available that we and others can detect tracking, no matter how sophisticated.

Is that why you’ve been staffing up on technology experts?

I don’t think we can be an effective enforcement agency unless we have the technological capability to detect and prove in court that there are violations. About two years ago, we started to build the first forensic mobile lab, for looking at mobile devices. Our concern is that we are seeing both privacy violations and fraud migrate to smart phones. And we wanted the ability not only to figure out that they were happening, but to capture evidence in real time that we could then use in a judicial proceeding. We proved, for example, that a children’s app company was collecting children’s geolocation data without getting parental consent.

But there are tons of challenges. There is off-the-shelf software you can use to do evidence captures for regular computers. For the moment, in mobile there is none that does the kinds of broad evidence capture that we need. So we’ve had to be ingenious in terms of assembling our lab.

Does the forensic laboratory keep an eye on apps generally, or does it focus on investigations?

We’re doing investigations, which we’re not going to tell you about right here. One example that gives you a sense of the work: We did a report on children’s apps about three months ago. We tested probably 500 or 600 apps.

What mobile privacy questions do you think will be important in the near future?

As you know, overseas people now use their smart phones as their wallets. That technology is coming to the U.S., and coming very quickly. It’s obviously a great convenience for consumers, but there are serious privacy and security issues. We also held a huge conference in May on what we call “dot-com disclosures.” That is—how can you make effective [privacy] disclosures on your smart phone given the small real estate?

[Editor’s note: The FTC has clarified that it did not download or test the children’s apps. As described in its report, it studied the privacy disclosures and descriptions of those apps.]

