
MIT Technology Review

Earthquakes are seemingly random events that are hard to predict with any reasonable accuracy. And yet geologists make very specific long-term forecasts that can help to dramatically reduce the number of fatalities.

For example, the death toll from earthquakes in the developed world, in places such as Japan and New Zealand, would have been vastly greater were it not for strict building regulations enforced on the back of well-founded predictions that big earthquakes were likely in the future.

The problem with earthquakes is that they follow a power law distribution: small earthquakes are common and large earthquakes very rare, but the difference in their power spans many orders of magnitude.

Humans have a hard time dealing intuitively with these kinds of statistics. But in the last few decades statisticians have learnt how to handle them, provided that they have a reasonable body of statistical evidence to go on.  
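To give a flavour of what "handling" a power law involves, here is a minimal Python sketch of the standard maximum-likelihood estimator for a power-law exponent (the continuous approximation popularized in work co-authored by Clauset). The data here are synthetic, and the values of alpha and xmin are arbitrary choices for illustration.

```python
import math
import random

def fit_power_law_alpha(data, xmin):
    """Maximum-likelihood estimate of the power-law exponent alpha,
    using only the values at or above the cutoff xmin
    (continuous approximation)."""
    tail = [x for x in data if x >= xmin]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / xmin) for x in tail)

# Generate synthetic power-law data by inverse-transform sampling:
# if u ~ Uniform(0, 1), then xmin * (1 - u)**(-1/(alpha - 1))
# follows a power law with exponent alpha.
random.seed(42)
alpha_true, xmin = 2.4, 10.0
sample = [xmin * (1 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50_000)]

print(round(fit_power_law_alpha(sample, xmin), 2))  # close to 2.4
```

With a large enough sample, the estimate recovers the true exponent closely; with the sparse data typical of rare, extreme events, the uncertainty in alpha is exactly why analyses like Clauset and Woodard's report ranges rather than point estimates.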

That’s made it possible to make predictions about all kinds of phenomena governed by power laws, everything from earthquakes, forest fires and avalanches to epidemics, the volume of email and even the spread of rumours. 

So it shouldn’t come as much of a surprise that Aaron Clauset at the University of Colorado in Boulder and the Santa Fe Institute in New Mexico and Ryan Woodard at ETH, the Swiss Federal Institute of Technology in Zurich have used this approach to study the likelihood of terrorist attacks. 

These guys say there is a puzzle associated with 9/11. The death toll from these attacks was six times larger than that of the next largest attack in a database of terrorist incidents stretching back to 1968.

That raises a curious statistical question. "Given their severity, should these attacks be considered statistically unlikely or even outliers?" ask Clauset and Woodard.

From a practical point of view, the answer seems obvious. Numerous building complexes house tens of thousands of people, and sporting events regularly gather upwards of 50,000 people into an area not much bigger than a football field. So it's not hard to imagine an attack causing many more deaths. By that measure, a catastrophic terrorist attack is by no means unthinkable.

But the question that Clauset and Woodard ask is whether such thinking is justified by the statistics of terrorist attacks. After all, there may be a mechanism, such as increased security at big events, that prevents these kinds of attacks.

So these guys have used a dataset of over 13,000 terrorist events between 1969 and 2007 to calculate the likelihood of an attack with a death toll equal to or greater than that of 9/11.

These calculations are tricky because you first have to decide which statistical distribution best describes the pattern of attacks in the past.

Clauset and Woodard make three different estimates based on various possible distributions, such as the power law, log-normal and stretched exponential distributions.

They calculate that the chance of at least one 9/11-sized event at some point in the last 40 years was between 11 and 35 per cent.
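The basic arithmetic behind a figure like this is simple once a tail probability is in hand: given a per-attack probability of reaching the 9/11 death toll, the chance of at least one such event among n independent attacks is one minus the chance of none. The per-attack probability below is a made-up illustrative number, not the paper's fitted value.

```python
def prob_at_least_one(p_event, n_events):
    """Probability of at least one event of a given size among
    n_events independent attacks, each with tail probability p_event."""
    return 1.0 - (1.0 - p_event) ** n_events

# Hypothetical numbers for illustration only: suppose a fitted tail
# model implied a 1-in-100,000 chance that any single attack reaches
# the 9/11 death toll, across a database of roughly 13,000 events.
print(round(prob_at_least_one(1e-5, 13_000), 2))
```

Even a tiny per-event probability compounds over thousands of events, which is why the historical probability of a 9/11-sized attack comes out far from negligible.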

That’s important. It means that 9/11 itself was not at all unlikely given the pattern of terrorist activity leading up to it.

Clauset and Woodard then use the same method to make a prediction about the future. Given the pattern of terrorist behaviour in the past, how likely is another 9/11-type event in the next 10 years?

Assuming that the number of terrorist events per year remains at its current level of about 2,000, the likelihood of another 9/11 is between 20 and 50 per cent, depending on the choice of distribution.

The 50 per cent prediction comes from the power law distribution which many experts argue is a good fit to the data. By that measure, a catastrophic attack is as likely as not.

Of course, conditions might change. The current level of terrorist incidents is heavily influenced by the number of attacks in Iraq and Afghanistan. It’s possible to argue that the numbers will drop in the near future as these regions become more stable. 

In that case, the likelihood of a 9/11-type attack in the next ten years drops to between 5 and 20 per cent, say Clauset and Woodard. 

But it takes a brave observer to make any kind of prediction about the stability of these regions. The number of attacks could just as easily increase because of any number of factors, such as a rise in food prices.

So Clauset and Woodard also calculate the likelihood of a 9/11-type event in this pessimistic scenario. The results make for frightening reading. In this case, the power law model predicts that an attack with a death toll greater than 9/11 is a 95 per cent certainty.

That’s something worth considering in a lot more detail. 

Ref: arxiv.org/abs/1209.0089: Estimating The Historical And Future Probabilities Of Large Terrorist Events
