The Historical Precedent
As I hinted above, we’ve been down this road before. The Pure Food and Drug Act of 1906 was passed by Congress to deal with a remarkably similar set of deceptive business practices. The problem back in 1906 was foods and drugs that were sold with misleading labels, or without labels at all.
The 1906 Act required that every drug sold in the United States be delivered to the consumer in a package stating the drug’s strength, quality, and purity if they differed from accepted standards. The dose of the drug had to be clearly printed on the outside of the package. A number of ingredients that tended to accompany nineteenth-century patent medicines-substances like alcohol, codeine, and cannabis-had to be clearly disclosed as well.
In the case of food, the Act required that labels explicitly mention any artificial colors and flavors-after 1906, you couldn’t sell something called “orange soda” unless it had flavoring that came from genuine oranges. Otherwise you were selling “imitation” or “artificial” orange soda. And every bottle, box, and bag of food needed to clearly indicate the precise weight of the food that was inside the container.
The Pure Food and Drug Act was successful for many reasons. Forcing manufacturers to disclose what was in their products allowed consumers to avoid products that contained things they didn’t want to ingest. For example, many of the snake-oil tonics distributed at the end of the nineteenth century contained significant doses of addictive drugs like codeine or cocaine. Forcing manufacturers to disclose these drugs on the product’s label, along with a warning that said “may be habit forming,” made it possible for consumers to make informed decisions. Labeling also empowered scientists and eventually consumer groups to check the product makers’ claims. Mandatory labeling put pressure on manufacturers to remove the most objectionable ingredients-a process that continues to this day. Finally, the labels provided additional evidence that lawmakers used to justify crafting further legislation.
The parallels between nineteenth-century adulterated food products and twenty-first-century adulterated software are uncanny. Just as some tonics claimed to do one thing (grow hair) when they actually did another (intoxicate the user and leave him chemically dependent on codeine), today we have software that claims to do one thing (set the time on your PC) and actually does another (display ads when you visit particular websites).
So what would a Pure Software Act look like? Judging from the 1906 legislation, the best results are likely to come from requiring labels that directly address the issue of deception. The new law would therefore require that software identify itself as such: no more hidden programs that silently install themselves and then run without any visible evidence. The Pure Software Act would make it illegal for programs to run without revealing themselves through the standard means used by the host operating system. And the Act would require that programs have an “uninstall” feature-or else make it very plain that they do not.
Documenting a program’s installation and providing for its removal is just the start. The Pure Software Act would require that the Federal Trade Commission identify specific practices of software that would have to be explicitly revealed when the programs are distributed and run. Instead of letting companies hide the features of their software with obscurely written legalese buried in click-through license agreements, the legislation would require that the disclosure be made in the form of easy-to-understand icons that could be clicked on for additional information. Clicking on the icon would bring up further explanatory text-perhaps from a website maintained by the Federal Trade Commission. The icons could also be displayed in other places. Under Windows, for example, the Task Manager and the Add/Remove control panel could both display the mandated behavior icons alongside the program’s application icon.
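To make the disclosure mechanism concrete, here is a minimal sketch in Python of how an operating system utility like a Task Manager might map a program’s declared behaviors to the mandated icons. Everything here is hypothetical: the catalog entries, icon names, explanatory text, and function names are illustrative stand-ins, not part of any real standard.

```python
# Hypothetical sketch: mapping declared program behaviors to
# disclosure icons. All codes, names, and explanations are illustrative.

# Catalog of behaviors a regulator might require programs to disclose,
# keyed by a short code and mapped to (icon name, one-line explanation) -
# the explanation is the text a user would see after clicking the icon.
ICON_CATALOG = {
    "hook":    ("Hook",    "Runs automatically every time the computer boots"),
    "dial":    ("Dial",    "Can place telephone calls from your computer"),
    "modify":  ("Modify",  "Alters the computer's operating system"),
    "monitor": ("Monitor", "Records keystrokes, documents, or web activity"),
    "remote":  ("Remote Control", "Lets others run commands on your computer"),
    "update":  ("Self-Updates", "May change its own behavior over time"),
    "stuck":   ("Stuck",   "Cannot be uninstalled once installed"),
}

def disclosure_panel(program_name, declared_behaviors):
    """Render the icon list an installer or Task Manager could show."""
    lines = [program_name]
    for code in declared_behaviors:
        icon, explanation = ICON_CATALOG[code]
        lines.append(f"  [{icon}] {explanation}")
    return "\n".join(lines)

# A clock-setting utility that runs at boot and updates itself:
print(disclosure_panel("ExampleClock.exe", ["hook", "update"]))
```

The point of keeping the catalog in one fixed table is that the same short list of icons would appear everywhere a program is shown, which is exactly what makes the labels scannable for consumers.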
A Modest Proposal
To make my proposal more concrete, I’ve come up with a list of program behaviors that would have to be disclosed, and some representative icons. These icons (created by TechnologyReview.com senior graphic designer Matthew Bouchard) are just samples to illustrate the concept. Actual government-mandated icons would be developed by a team of professionals with expertise in human computer interface, tested on focus groups, and put up for public comment. But these icons are useful to convey the general idea and to start discussion.
Hook: Runs at Boot
Dial: Places a Phone Call
One common spyware scam involves programs that cause your computer to call phone numbers that cost you money. For example, a few years ago some pornographic websites distributed a program called david.exe that caused the victim’s computer to make a long-distance phone call to an Internet service provider in Eastern Europe; the porn company got to keep half of the (exorbitantly high) long distance revenues. Other kinds of scam software might dial 900-numbers or even use your computer to send junk faxes without your knowledge. Documenting that the software has code that could make it dial your phone would be a good way to address this problem.
Modify: Alters Your Computer’s Operating System
Some programs do more than simply install themselves to run at boot-they alter your computer’s operating system. Seeing this icon would give you a reason to ask questions. More likely, forcing this kind of disclosure would simply end the practice on the part of developers.
Monitor: Keeps Track of What You’re Doing
Most programs mind their own business. But some software watches your keystrokes and monitors the Web pages you are viewing even as other programs run in the foreground. Programs can watch as you create files, make copies of every document that’s printed, or simply note when your computer is idle and when it’s in use. The key here is that personal information is being captured by a program when you think that it’s not listening. Perhaps this icon might incorporate a lightning bolt to indicate that the monitored information is reported back over the Internet to someone else.
Pop-Up: Opens Windows You Didn’t Request

A well-mannered program speaks only when spoken to. Some programs, on the other hand, demand your attention. I was astonished the other day when Microsoft Word 2003 popped up a window on my computer inviting me to participate in some kind of survey. A few years ago I noticed that an electronic wallet program called Gator was opening windows to competing websites whenever I visited certain online merchants.
Remote Control: Lets Other Programs Take Over Your Computer
In theory, any program that’s running on your computer can take it over and execute commands on behalf of others. In practice, only a few programs offer others such remote control. Programs that do should be labeled.
Self-Updates: This Program May Change Its Behavior
Stuck: Cannot Be Uninstalled
Some programs, once installed on your computer, are impossible to dislodge. These programs are typically operating system updates, but it is easy for a clever programmer to make uninstallable spyware as well. Consumers should be informed that there are some programs for which there is no going back.
Rules of Engagement
With the icons would come rules for their use. For instance, many of today’s click-through license agreements say that the user implicitly agrees to any changes in the license agreement unless those changes are “substantive.” But what is substantive? Once a labeling regime was in place, a substantive change could be legally defined as one that results in a change of icons-for example, if a self-updating program downloaded a remote-control feature. The law could then require fresh consent from the user before this sort of change took effect.
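Under this proposed definition, an update is “substantive” exactly when it changes the program’s icon set, so the consent rule reduces to a simple set comparison. A minimal sketch, with hypothetical icon names:

```python
# Hypothetical sketch of the proposed consent rule: an update is
# "substantive" - and therefore requires fresh user consent - exactly
# when it adds or removes any mandated disclosure icon.

def update_requires_consent(icons_before, icons_after):
    """Return True if the update changes the set of mandated icons."""
    return set(icons_before) != set(icons_after)

# A self-updating program that downloads a remote-control feature
# gains a new icon, so the update would require new consent:
before = {"hook", "self-updates"}
after = {"hook", "self-updates", "remote-control"}
needs_consent = update_requires_consent(before, after)

# An update that leaves the icon set unchanged would not.
```

One virtue of pinning the legal definition to the icon set is that it is mechanically checkable: an operating system could compare the sets itself and refuse to apply an update until the user clicked through the new icons.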
One tension inherent in any labeling regime is deciding what gets put on the label and what gets left out. The more information the label must carry, the more expensive it will be to produce, and the less likely consumers are to actually pay attention to it. Any regulatory body implementing this policy will need to avoid icon creep-having 23 different icons on each piece of software won’t serve the needs of consumers; it will just cause confusion.
Another tension is between voluntary and mandatory labeling. I think that mandatory is the way to go. We’re living in a voluntary regime today: Google has done a great job explaining what the Google Toolbar does, but other companies are not so forthcoming. Nearly 100 years’ experience with the Pure Food and Drug Act of 1906 shows that labeling requirements need not be onerous, but they do need to be mandatory-otherwise the good companies label and the bad companies don’t. What’s needed now is to extend this principle to the world of software.
I’ve been discussing this proposal for software labeling for several months with associates in Cambridge. At Harvard Law School, Jonathan Zittrain offered very helpful comments; at MIT’s Computer Science and Artificial Intelligence Laboratory, I had useful discussions and comments with my thesis advisors, Rob Miller and David Clark, and with my fellow student, Steven Bauer.