To keep hackers from commandeering machines remotely, both plans include features that allow data access only to a user who is physically present. Palladium, for instance, checks whether commands are really coming from the computer’s keyboard. The systems, which also scan for changes in the hardware and software since the previous time the computer booted up, can block access to sealed data if there are signs of tampering or if unauthorized software such as a virus tries to access cryptographic functions. And both Palladium and the Alliance scheme allow other similarly equipped computers to ask whether a machine is in a configuration they can trust; the chip always answers truthfully. This capability could allow a computer to query, say, an online banking site to make sure its security is satisfactory before starting an exchange of sensitive information. The overall goal, says Microsoft software architect Paul England, is to have computers “unconditionally protected against software attack.”

But could the cure be worse than the disease? Proponents argue that the technology will at last keep data such as passwords, financial and medical records, and proprietary secrets safe from theft, while also preventing damage to computer networks from viruses and other software attacks. Critics, however, point out that the new chips can also be used to give software makers and content providers an unprecedented degree of control over users’ machines. Opponents raise the specter of censorship of material on personal computers. They also warn about anticompetitive practices and draconian copyright enforcement through digital rights management. Trusted computing “could be good, could be bad,” says Seth Schoen of the Electronic Frontier Foundation, a nonprofit organization in San Francisco. “It allows a lot of things to happen, which in some sense couldn’t have happened before.”
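The boot-time scan described above can be pictured as a running “hash chain”: each piece of startup software is folded into a single digest, and sealed data is released only if that digest matches the configuration it was sealed to. The sketch below is purely illustrative, assuming SHA-256 as the hash; the function and variable names are invented and are not the actual Palladium or Alliance interfaces.

```python
# Illustrative sketch of measured boot and sealed storage, as used in
# trusted-computing designs. All names here are hypothetical.
import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    # Fold one boot component into the running digest. Altering, adding,
    # or reordering any component changes the final value.
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

def measure_boot(components) -> bytes:
    register = b"\x00" * 32  # register starts at a known value each boot
    for c in components:
        register = extend(register, c)
    return register

def unseal(sealed_to: bytes, current: bytes) -> str:
    # Sealed data is released only to the exact configuration
    # that was measured when the data was sealed.
    return "secret released" if sealed_to == current else "access blocked"

good = measure_boot([b"firmware v1", b"bootloader v1", b"os kernel v1"])
tampered = measure_boot([b"firmware v1", b"infected bootloader", b"os kernel v1"])

print(unseal(good, good))      # matching configuration
print(unseal(good, tampered))  # tampering detected, data stays sealed
```

The same digest is what the chip would report when another computer asks whether the machine is in a trustworthy configuration, which is why the “always answers truthfully” property matters.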
Ross Anderson, a University of Cambridge computer scientist and an outspoken opponent of both plans, says trusted computing “can be useful, but it’s a lot harder to do right than you might think.” The systems could be used for remote detection and deletion of illegal software copies and even to exact payment each time a user plays a downloaded song, Anderson says. He also suggests that governments could use such features to actively censor documents and photos judged politically sensitive or morally offensive: a state could require the chip to check online for valid licenses or lists of prohibited material and authorize it to delete anything deemed illicit.
Anderson and Schoen also worry that the new security technologies could be used to stifle competition. A maker of word processing software, for instance, could build in cryptographic keys other companies wouldn’t have, preventing competitors’ document formats from working on any machine that runs its software. That “would be a disadvantage for users,” says Schoen. “But if a software publisher has a lot of market power, they can get away with including features that are a disadvantage to users.”