Like the atomic bomb in the waning days of World War II, the computer virus known as Stuxnet, discovered in 2010, seemed to usher in a new era of warfare. In the era of cyberwar, experts warned, silent, software-based attacks will take the place of explosive ordnance, tanks, and machine guns, or at least set the stage for them.
Or maybe not. Almost four years after it was first publicly identified, Stuxnet is an anomaly: the first and only cyberweapon ever known to have been deployed. Now some experts in cybersecurity and critical infrastructure want to know why. Are there fewer realistic targets than suspected? Are such weapons more difficult to construct than realized? Or is the current generation of cyberweapons simply too well hidden?
Such questions were on the minds of the world’s top experts in the security of industrial control systems last week at the annual S4 conference outside Miami, which gathers specialists in securing nuclear reactors, power grids, and assembly lines.
At S4 there was broad agreement that—long after Stuxnet’s name has faded from the headlines—industrial control systems like Siemens programmable logic controllers are still vulnerable.
Eireann Leverett, a security researcher at the firm IOActive, told attendees at the conference that commonplace security practices in the world of enterprise information technology are still uncommon among vendors who develop industrial control systems (see “Protecting Power Grids from Hackers Is a Huge Challenge”). Leverett noted that modern industrial control systems, which sell for thousands of dollars per unit, often ship with software that lacks basic security controls like user authentication, code signing to prevent unauthorized software updates, or event logging to allow customers to track changes to the device.
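To illustrate one of the missing controls Leverett describes, here is a minimal sketch of how code signing can block unauthorized firmware updates. This is a hypothetical illustration, not any vendor’s actual scheme: it uses an HMAC with a shared key as a stand-in, whereas real code signing uses asymmetric signatures so devices hold only a public verification key.

```python
import hashlib
import hmac

# Hypothetical vendor key for illustration only; a real code-signing
# scheme would use an asymmetric key pair, with the private key kept
# by the vendor and only the public key shipped on the device.
VENDOR_KEY = b"example-vendor-key"

def sign_firmware(image: bytes) -> bytes:
    """Vendor side: compute an authentication tag over the firmware image."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes) -> bool:
    """Device side: accept an update only if its tag checks out."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking the tag via timing.
    return hmac.compare_digest(expected, tag)

firmware = b"\x90\x90\x90..."  # stand-in for a firmware blob
tag = sign_firmware(firmware)

assert verify_firmware(firmware, tag)             # legitimate update accepted
assert not verify_firmware(firmware + b"!", tag)  # tampered update rejected
```

A controller that performed even this basic check before flashing an update would reject firmware that had been modified in transit—one of the gaps Leverett says many industrial devices still leave open.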
It is also clear that, in the years since Stuxnet came to light, developed and developing nations alike have seized on cyber operations as a fruitful new avenue for research and development (see “Welcome to the Malware Industrial Complex”). Laura Galante, a former U.S. Department of Defense intelligence analyst who now works for the firm Mandiant, said that the U.S. isn’t just tracking the activities of nations like Russia and China, but also Syria and Stuxnet’s target of choice: Iran. Galante said cyberweapons give smaller, poorer nations a way to leverage asymmetric force against much larger foes.
Even so, truly effective cyberweapons require extraordinary expertise. Ralph Langner, perhaps the world’s top authority on the Stuxnet worm, argues that the mere hacking of critical systems doesn’t count as cyberwarfare. For example, Stuxnet made headlines for using four exploits for “zero day” (or previously undiscovered) holes in the Windows operating system. But Langner said the metallurgical expertise needed to understand the construction of Iran’s centrifuges was far more impressive. Those who created Stuxnet needed to know the exact amount of pressure or torque needed to damage the aluminum rotors within them, sabotaging the country’s uranium enrichment operation.
Concentrating on software-based tools that can cause physical harm sets a much higher bar for discussions of cyberweapons, Langner argues. By that standard, Stuxnet was a true cyberweapon, but the 2012 Shamoon attack against the oil giant Saudi Aramco and other oil companies was not, even though it erased the hard drives of the computers it infected.
Some argue that the conditions for using such a destructive cyberweapon simply haven’t arisen again—and aren’t likely to for a while. Operations like Stuxnet—stealth projects designed to slowly degrade Iran’s enrichment capability over years—are the exception rather than the rule, said Thomas Rid of the Department of War Studies at King’s College London. “There are not too many targets that would lend themselves to a covert campaign as Stuxnet did,” Rid said.
Rid told attendees that the quality of the intelligence gathered on a particular target makes the difference between an effective cyberweapon and a flop.
It’s also possible that other cyberweapons have been used, but the circumstances surrounding their use are a secret, locked up by governments as “classified” information, or protected by strict nondisclosure agreements.
Indeed, Langner, who works with some of the world’s leading industrial firms and governments, said he knows of one other true physical cyberattack, this one tied to a criminal group. But he wouldn’t talk about it.
Industrial control professionals and academics complain that the information needed to research future attacks is being kept out of the public domain. And public utilities, industrial firms, and owners of critical infrastructure are just now becoming aware that systems they assumed were cordoned off from the public Internet very often are not.
Meanwhile, technology is driving even more rapid and transformative changes as part of what’s called the Internet of things. Ubiquitous Internet connectivity combined with inexpensive and tiny computers and sensors will soon allow autonomous systems to communicate directly with each other (see “Securing the Smart Home, from Toasters to Toilets”).
Without proper security features built into industrial products from the get-go, the potential for attacks and physical harm increases dramatically. “If we continue to ignore the problem, we are going to be in deep trouble,” Langner said.