
The Pure Software Act of 2006

100 years ago, Congress passed a law requiring honest labeling of food and drugs. Now the time has come to do the same for software.

Spyware is the scourge of desktop computing. Yes, computer worms and viruses cause billions of dollars in damage every year. But spyware (programs that either record your actions for later retrieval or automatically report on your actions over the Internet) combines commerce and deception in ways that most of us find morally repugnant.

Worms and viruses are obviously up to no good: these programs are written by miscreants and released into the wild for no purpose other than wreaking havoc. But most spyware is authored by law-abiding companies, which trick people into installing the programs onto their own computers. Some spyware is also sold for the explicit purpose of helping spouses to spy on their partners, parents to spy on their children, and employers to spy on their workers. Such programs cause computers to betray the trust of their users.

Until now, the computer industry has focused on technical means to control the plague of spyware. Search-and-destroy programs such as Ad-Aware will scan your computer for known spyware, tracking cookies, and other items that might compromise your privacy. Once identified, the offending items can be quarantined or destroyed. Firewall programs like ZoneAlarm take a different approach: they don’t stop the spyware from collecting data, but they prevent the programs from transmitting your personal information out over the Internet.

But there is another way to fight spyware, an approach that would work precisely because the authors of most spyware are legitimate organizations. Congress could pass legislation requiring that software distributed in the United States come with product labels revealing to consumers the specific functions built into the programs. Such legislation would likely have the same kind of pro-consumer results as the Pure Food and Drug Act of 1906, the law responsible for today’s labels on food and drugs.

The Art of Deception

Mandatory software labeling is a good idea because the fundamental problem with spyware is not the data collection itself, but the act of deception. Indeed, many of the things that spyware does are also done by non-spyware programs. Google’s Toolbar for Internet Explorer, for example, reports back to Google which website you are looking at so that the toolbar can display the site’s “page rank.” But Google goes out of its way to disclose this feature: when you install the program, Google makes you decide whether you want to have your data sent back or not. “Please read this carefully,” says the Toolbar’s license agreement, “it’s not the usual yada yada.”

Spyware, on the other hand, goes out of its way to hide its true purpose. One spyware program claims to automatically set your computer’s clock from the atomic clock operated by the U.S. Naval Observatory. Another displays weather reports customized for your area. Alas, both of these programs also display pop-up advertisements when you go to particular websites. (Some software vendors insist that programs that merely display advertisements are not spyware per se, but rather something called adware. Most users don’t care about the distinction.)

Some of these programs hide themselves by not displaying icons when they run, and even by removing themselves from the list of programs running on your computer. I’ve heard of programs that list themselves in the Microsoft Windows Add/Remove Programs control panel, but when you go to remove them, they don’t actually uninstall; they just make themselves invisible. Sneaky.

Yet despite this duplicity, most spyware and adware programs aren’t breaking any U.S. law. That’s because many of these programs disclose what they do and then get the user’s explicit consent. They do this with something called a click-wrap license agreement, one of those boxes full of legal mumbo-jumbo that appears when you install a program or run it for the first time. The text more or less spells out all of the covert tricks that these hostile programs might play on your system. Of course, hardly anybody reads these agreements. Nevertheless, the agreements effectively shield purveyors of spyware and adware from liability. After all, you can’t claim that the spyware was monitoring your actions without your permission if you gave the program permission by clicking that “I agree” button.

Uniform standards for labeling software wouldn’t replace the need for license agreements, but they would make it harder for companies to bury a program’s functions. Such legislation (call it the Pure Software Act of 2006) would direct the Federal Trade Commission to establish standards for the mandatory labeling of all computer programs distributed within the United States. A labeling requirement would force makers of spyware to reveal their programs’ hidden features.

The Historical Precedent

As I hinted above, we’ve been down this road before. The Pure Food and Drug Act of 1906 was passed by Congress to deal with a remarkably similar set of deceptive business practices. The problem back in 1906 was foods and drugs that were sold with misleading labels, or without labels at all.

The 1906 Act required that every drug sold in the United States be delivered to the consumer in a package stating the drug’s strength, quality, and purity whenever these differed from accepted standards. The dose of the drug had to be clearly printed on the outside of the package. A number of ingredients that tended to accompany nineteenth-century patent medicines, substances like alcohol, codeine, and cannabis, had to be clearly disclosed as well.

In the case of food, the Act required that labels explicitly mention any artificial colors and flavors: after 1906, you couldn’t sell something called “orange soda” unless it had flavoring that came from genuine oranges. Otherwise you were selling “imitation” or “artificial” orange soda. And every bottle, box, and bag of food needed to clearly indicate the precise weight of the food inside the container.

The Pure Food and Drug Act was successful for many reasons. Forcing manufacturers to disclose what was in their products allowed consumers to avoid products containing things they didn’t want to ingest. For example, many of the snake-oil tonics distributed at the end of the nineteenth century contained significant doses of addictive drugs like codeine or cocaine. Forcing manufacturers to disclose these drugs on the product’s label, along with a warning that said “may be habit forming,” made it possible for consumers to make informed decisions. Labeling also empowered scientists, and eventually consumer groups, to check product makers’ claims. Mandatory labeling put pressure on manufacturers to remove the most objectionable ingredients, a process that continues to this day. Finally, the labels provided additional evidence that lawmakers used to justify crafting further legislation.

The parallels between nineteenth-century adulterated food products and twenty-first-century adulterated software are uncanny. Just as some tonics claimed to do one thing (grow hair) when they actually did another (intoxicate the user and leave him chemically dependent on codeine), today we have software that claims to do one thing (set the time of your PC) and actually does another (display ads when you visit particular websites).

So what would a Pure Software Act look like? Judging from the 1906 legislation, the best results are likely to come from requiring labels that directly address the issue of deception. The new law would therefore require that software identify itself as such: no more hidden programs that silently install themselves and then run without any visible evidence. The Pure Software Act would make it illegal for programs to run without revealing themselves through the standard means used by the host operating system. And the Act would require that programs have an “uninstall” feature, or else make it very plain that they do not.

Documenting a program’s installation and providing for its removal is just the start. The Pure Software Act would require that the Federal Trade Commission identify specific software practices that would have to be explicitly revealed when programs are distributed and run. Instead of letting companies hide the features of their software in obscurely written legalese buried in click-through license agreements, the legislation would require that disclosure take the form of easy-to-understand icons; clicking on an icon would bring up further explanatory text, perhaps from a website maintained by the Federal Trade Commission. The icons could also be displayed in other places. Under Windows, for example, the Task Manager and the Add/Remove Programs control panel could both display the mandated behavior icons alongside the program’s application icon.

A Modest Proposal

To make my proposal more concrete, I’ve come up with a list of program behaviors that would have to be disclosed, along with some representative icons. These icons (created by TechnologyReview.com senior graphic designer Matthew Bouchard) are just samples to illustrate the concept. Actual government-mandated icons would be developed by a team of professionals with expertise in human-computer interaction, tested on focus groups, and put up for public comment. But these icons are useful to convey the general idea and to start a discussion.

Hook: Runs at Boot
Some programs hook themselves into your computer’s operating system so that they automatically run whenever the computer is rebooted or a user logs in. Other programs don’t. Today there’s no way to tell except by performing a detailed analysis of the computer’s configuration files before and after the program is installed and noting the changes (a rough sketch of such a check appears after this list). Any program that installs itself so that it automatically runs would have to display this Hook icon.
Dial: Places a Phone Call
One common spyware scam involves programs that cause your computer to call phone numbers that cost you money. For example, a few years ago some pornographic websites distributed a program called david.exe that caused the victim’s computer to make a long-distance phone call to an Internet service provider in Eastern Europe; the porn company got to keep half of the (exorbitantly high) long-distance revenues. Other kinds of scam software might dial 900 numbers or even use your computer to send junk faxes without your knowledge. Documenting that the software contains code that could dial your phone would be a good way to address this problem.
Modify: Alters Your Computer’s Operating System
Some programs do more than simply install themselves to run at boot: they alter your computer’s operating system. Seeing this icon would give you a reason to ask questions. More likely, forcing this kind of disclosure would simply end the practice on the part of developers.
Monitor: Keeps Track of What You’re Doing
Most programs mind their own business. But some software watches your keystrokes and monitors the Web pages you are viewing even as other programs run in the foreground. Programs can watch as you create files, make copies of every document that’s printed, or simply note when your computer is idle and when it’s in use. The key here is that personal information is being captured by a program when you think that it isn’t listening. Perhaps this icon might incorporate a lightning bolt to indicate that the monitored information is reported back over the Internet to someone else.
Displays Pop-Ups
A well-mannered program speaks only when spoken to. Some programs, on the other hand, demand your attention. I was astonished the other day when Microsoft Word 2003 popped a window up on my computer inviting me to participate in some kind of survey. A few years ago I noticed that an electronic wallet program called Gator was opening up windows to competing websites whenever I visited certain online merchants.
Remote Control: Lets Other Programs Take Over Your Computer
In theory, any program running on your computer can take it over and execute commands on behalf of others. In practice, very few programs offer others this kind of remote control. Programs that do should be labeled.
Self-Updates: This Program May Change Its Behavior
One of the most important ways for software vendors to deal with persistent computer security problems is to have their programs automatically update themselves with code downloaded from the Internet. Programs that have this feature should advertise that capability, because they can change their behavior without any input from the user.
Stuck: Cannot be Uninstalled
Some programs, once installed on your computer, are impossible to dislodge. These programs are typically operating-system updates, but it is easy for a clever programmer to make spyware that cannot be removed, either. Consumers should be informed that for some programs there is no going back.
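
To make the Hook example above more concrete, here is the kind of before-and-after detective work a consumer has to do today in the absence of a Hook label. This is a minimal sketch, not part of the proposal itself, and it assumes a Windows machine where Python’s standard winreg module can read the well-known “Run” registry keys; startup folders, services, and scheduled tasks are left out for brevity.

```python
# A rough sketch: list the programs that have "hooked" themselves to run
# automatically at login by reading the standard Windows "Run" registry keys.
# (Other hooking points, such as services and startup folders, are omitted.)
import winreg

RUN_KEYS = [
    (winreg.HKEY_CURRENT_USER,  r"Software\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
]

def list_boot_hooks():
    """Return (name, command) pairs for programs registered to run at login."""
    hooks = []
    for root, path in RUN_KEYS:
        try:
            key = winreg.OpenKey(root, path)
        except OSError:
            continue  # this key may not exist on some systems
        with key:
            index = 0
            while True:
                try:
                    name, command, _type = winreg.EnumValue(key, index)
                except OSError:
                    break  # no more values under this key
                hooks.append((name, command))
                index += 1
    return hooks

if __name__ == "__main__":
    for name, command in list_boot_hooks():
        print(f"{name}: {command}")
```

Comparing the output of a script like this before and after installing a new program is exactly the tedious analysis that a mandatory Hook icon would spare consumers.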

Rules of Engagement

With the icons would come rules for their use. For instance, many of today’s click-through license agreements say that the user implicitly agrees to any changes in the license agreement unless those changes are “substantive.” But what is substantive? With a labeling regime in place, a substantive change could be legally defined as one that results in a change of icons: for example, a self-updating program that downloads a remote-control feature. The law could then require fresh consent from the user for that sort of change.
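
To show how mechanical that test could be, here is a small sketch that treats a program’s disclosed behaviors as a set of icon names and flags an update as substantive whenever that set changes. The icon names and function names are hypothetical, invented only for this illustration, not an FTC-mandated vocabulary.

```python
# A minimal sketch of the "substantive change" rule described above: an update
# is substantive exactly when the set of disclosed behavior icons changes.
# The icon names below are hypothetical labels chosen for this example.

def is_substantive_change(old_icons: set, new_icons: set) -> bool:
    """An update is substantive if any behavior icon is added or removed."""
    return old_icons != new_icons

def newly_disclosed(old_icons: set, new_icons: set) -> set:
    """Icons the updated version would have to disclose for the first time."""
    return new_icons - old_icons

# Example: a self-updating program downloads a remote-control feature.
before = {"hook", "self-updates"}
after = {"hook", "self-updates", "remote-control"}

if is_substantive_change(before, after):
    print("New consent required; added behaviors:", newly_disclosed(before, after))
```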

One tension inherent in any labeling regime is deciding what gets put on the label and what gets left out. The more information required on the label, the more expensive it will be to produce, and the less likely consumers will be to actually pay attention to it. Any regulatory body implementing this policy will need to avoid icon creep: having 23 different icons on each piece of software won’t serve the needs of consumers; it will just cause confusion.

Personally, I’d like my software labels to distinguish between information that’s collected and used in aggregate form and personally identifiable information that’s stockpiled in a large data warehouse. But fundamentally this isn’t about what the program does; it’s about what the company does after the program has reported its information. That is, it’s a business practice that should be governed by the company’s privacy policy. Perhaps we need icons there, too. (Years ago, the trade organization TRUSTe tried to establish three icons for three different kinds of standard privacy policies; it gave up when its member companies balked.)

Another tension is between voluntary and mandatory labeling. I think mandatory is the way to go. We’re living in a voluntary regime today: Google has done a great job explaining what the Google Toolbar does, but other companies are not so forthcoming. Nearly 100 years’ experience with the Pure Food and Drug Act of 1906 shows that labeling requirements need not be onerous, but they do need to be mandatory; otherwise the good companies label and the bad companies don’t. What’s needed now is to extend this principle to the world of software.

Acknowledgements
I’ve been discussing this proposal for software labeling for several months with associates in Cambridge. At Harvard Law School, Jonathan Zittrain offered very helpful comments; at MIT’s Computer Science and Artificial Intelligence Laboratory, I had useful discussions and comments with my thesis advisors, Rob Miller and David Clark, and with my fellow student, Steven Bauer.
