MIT Technology Review



I have been thinking about risk.

As I write this column in early June, BP is still struggling to contain its leaking well in the Gulf of Mexico. Since the company’s drilling rig, Deepwater Horizon, exploded on April 20, as many as 19,000 barrels of oil (roughly 800,000 gallons) have spewed into the Gulf every day. A cap is now capturing a little more than 400,000 gallons a day. It is the worst environmental disaster in the history of the United States, but there may be no solution until BP completes two “relief wells” in August.

In this issue of Technology Review, David Talbot writes about the increasing incidence of cyber crime and espionage, and the real (if still speculative) risk of outright cyber warfare. In “Moore’s Outlaws,” he quotes Stewart Baker, the former general counsel of the National Security Agency and a former policy chief at the U.S. Department of Homeland Security: “What we’ve been seeing, over the last decade or so, is that Moore’s Law is working more for the bad guys than the good guys. It’s really ‘Moore’s outlaws’ who are winning this fight. Code is more complex, and that means more opportunity to exploit the code. There is more money to be made in exploiting the code, and that means there are more and more sophisticated people looking to exploit vulnerabilities. If you look at things like malware found, or attacks, or the size of the haul people are pulling in, there is an exponential increase.”

Talbot describes experts’ concerns that computer viruses have turned millions of machines into “enslaved armies” (botnets) awaiting instruction from malefactors. In the days leading up to April 1, 2009, a worm called Conficker was expected to receive an update from its unknown creator, but no one knew what that update would do: “A tweak to Conficker’s code might cause the three million or so machines … to start attacking the servers of some company or government network, vomit out billions of pieces of spam, or just improve the worm’s own ability to propagate.” It’s scary stuff.

In the first case, a complex system of technologies (whose purpose is to extract crude oil five miles beneath the ocean’s surface) failed; in the second, a still more complex system (a global computer network whose purposes are incomprehensibly various, but upon which our technological civilization depends) is failing. These failures are not so much predictable as unsurprising. We expanded our use of vulnerable technologies because we were dependent upon them. How should we think about the risks inherent in technologies, particularly new technologies?

Credit: Mark Ostow

Jason Pontin, Editor
