On alert: The Pentagon wants better real-time insights about the computer security threats facing companies that operate power plants and other critical pieces of infrastructure. Here, Department of Homeland Security analysts take part in a 2011 drill that simulated a computer assault on an industrial control network.
Last week U.S. defense secretary Leon Panetta warned that critical infrastructure such as power grids or chemical plants could be shut down or destroyed by a cyber attack, and he pledged that the U.S. would “defend the nation in cyberspace” as it does on land and sea, in air and space.
But with the art of cyber detection and defense lagging far behind the sophistication of attacks (see “Hey, Hackers: Defense Is Sexy, Too”), the U.S. and other nations appear largely unprepared to rapidly detect and respond to an attack on critical infrastructure. That would make it difficult to respond with “decisive action” as Panetta promised, or even to know whom to retaliate against.
Working out the nature and source of an attack is particularly challenging for critical infrastructure systems, which run on tried, trusted, and consequently outdated software (see “Old-Fashioned Control Systems Make U.S. Power Grids, Water Plants a Hacking Target”).
“We don’t have technology to secure these systems [and] don’t even have technology to do cyber forensics or logging at the control layer,” says Joe Weiss, managing director of the International Society of Automation’s efforts to create security standards for industrial control systems. Through his consulting firm, he is working with both the Defense Department and infrastructure companies on security efforts.
Although an infrastructure company such as an electric utility might have a chance of spotting and reverse-engineering an attack on its everyday office computers thanks to conventional computer security software, crucial information about how hackers exploited, say, the industrial control software running the power grid would be all but impossible to gather, says Weiss. That means that in the event of an attack on a power grid or water plant, it could take some time to determine whether it was an accident or not, he says: “You can’t hide the lights going off, but you can sure be in a position to not know it was cyber that caused it.”
Weiss cites a power outage that affected three million people in Florida in 2008 as an example. The incident was eventually traced to an error by one employee who disabled two protection systems. “The only difference between it being malicious or unintentional was the motivation of that one guy,” says Weiss.
He sees little prospect of anything changing quickly, because the design of power grids evolves slowly. “How do you secure a system that cannot be upgraded for security and will not be replaced in years?” he says. “You can’t do to these systems what you would do in the IT world.” The U.S. is far from alone in facing such questions, says Weiss, because only a handful of manufacturers supply the control systems and grid hardware to countries around the globe.
Although Panetta asserted last week that the Defense Department had made “significant investments in forensics to address this problem of attribution” and is “seeing the returns on that investment,” the Pentagon lacks a way to find out quickly if U.S. infrastructure is under attack. It can’t simply peer inside the networks of every power grid operator the way radar stations can scan the skies. Panetta grumbled that Congress was holding up progress by failing to pass the Cybersecurity Act of 2012. It would create programs encouraging critical infrastructure companies to share data from their networks with the Pentagon for the purpose of detecting hacking attacks in progress.
The U.S. Chamber of Commerce helped lead a corporate campaign that shut down that bill. Chris Blask, founder and CEO of ICS Cybersecurity, says the companies involved worried that handing over information from their internal systems would be costly and could reveal things that led to liability claims or regulatory judgments. Blask, who is also chair of the Industrial Control System Information Sharing and Analysis Center, a public-private body trying to set up procedures for sharing information about cyber security, says anonymizing what companies share could make Panetta’s approach more palatable: “There’s a difference between sharing the knowledge that something happened to a particular type of system and sharing raw data.”
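A minimal sketch of the kind of anonymization Blask describes might look like the following. All field names, the salt, and the record structure here are hypothetical illustrations, not any real sharing scheme; the idea is simply that an operator strips identifying fields and replaces the raw indicator with a salted hash, so peers can match sightings of the same attack without learning who reported it or seeing raw network data.

```python
import hashlib

def anonymize_incident(record, salt):
    """Return a shareable summary of a raw incident record.

    Drops identifying fields (operator name, internal IPs) and keeps
    only the affected system type, the observed effect, and a salted
    hash of the attack indicator so other members of the sharing
    group can match it without recovering the raw data.
    All field names are hypothetical.
    """
    indicator_hash = hashlib.sha256(
        salt + record["indicator"].encode()
    ).hexdigest()
    return {
        "system_type": record["system_type"],       # e.g. a control-system product family
        "indicator_sha256": indicator_hash,          # matchable, but not raw data
        "observed_effect": record["observed_effect"],
    }

# Hypothetical raw record as an operator might log it internally.
raw = {
    "operator": "Example Utility Co.",      # identifying -- dropped
    "plant_ip": "203.0.113.7",              # identifying -- dropped
    "system_type": "substation RTU",
    "indicator": "repeated unauthorized writes from 203.0.113.99",
    "observed_effect": "protection relay disabled",
}

shared = anonymize_incident(raw, salt=b"group-agreed-salt")
```

A salted hash like this is only a sketch: a determined recipient could still brute-force low-entropy indicators, which is why real information-sharing bodies layer in legal agreements and trusted intermediaries rather than relying on hashing alone.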
The government’s ability to detect and respond to attacks on infrastructure is also complicated by the different roles, responsibilities, and technological capabilities of the government agencies involved. While the Pentagon and the National Security Agency are believed to have the best technology and intelligence on computer security threats, they are supposed to protect only military computer and communications networks, known in government terminology as “.mil.” Responsibility for protecting “.com”—private companies and civilian networks, including power-grid operators and the like—falls on the Department of Homeland Security, while the Department of Energy has some responsibility for power infrastructure.
The Defense Department is developing new rules of engagement for cyberspace, and Panetta said last week that they would enshrine the Pentagon’s role in defending not only military networks but the United States as a whole, laying out the permitted responses to particular types of incidents.
Such plans can’t be directly modeled on traditional warfare, says Joseph Nye, a professor at Harvard’s John F. Kennedy School of Government. “The big difference is if the government sees an impending cyber attack, they only have 300 milliseconds to respond,” he says. “If it were a ‘normal’ war, you might have time to convene a meeting at the White House. In the cyber world, you don’t.”