Their approach is straightforward. They take the total number of papers published by researchers in a particular city and then count how many of these appear among the top ten per cent of most highly cited papers. On average, you’d expect ten per cent of a city’s papers to fall into this top group.
“For example, if authors located in one city have published 10,000 papers, one would expect for statistical reasons that a thousand (that is, 10%) belong to the top 10% most highly cited papers,” say Bornmann and Leydesdorff.
They then compare the expected number of top papers from a city with the actual number.
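The comparison they describe can be sketched in a few lines of Python. This is only an illustration of the arithmetic, not their actual code; the function name and the example figures (taken loosely from the London result quoted below) are assumptions.

```python
# Illustrative sketch of the expected-vs-observed comparison described above.
# The function name and numbers are hypothetical, not Bornmann and
# Leydesdorff's own code or data.

def top_paper_surplus(total_papers, observed_top, top_share=0.10):
    """Return (expected count, observed/expected ratio) for one city.

    total_papers: papers published by authors in the city
    observed_top: how many of those fall in the top 10% most cited
    """
    expected = total_papers * top_share  # e.g. 10,000 papers -> 1,000 expected
    return expected, observed_top / expected

# Loosely echoing the article's London figures: 143 papers, 46 in the top 10%.
expected, ratio = top_paper_surplus(143, 46)
# expected is about 14.3; ratio is a little over 3, i.e. more than three
# times as many highly cited papers as chance would predict.
```

A ratio above 1 marks a city that outperforms expectations (dark green on their maps); below 1, an underperformer (red).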
Finally, they plot the results on a map, showing cities with more highly cited papers than expected in dark green and those with fewer than expected in red. The bigger the dot, the more papers involved.
Bornmann and Leydesdorff have done this for physics papers that appeared in Scopus in 2008, counting citations up to February 2011. The screenshot above shows the physics papers map.
The results for physics indicate that the best performers are London, Paris, Karlsruhe, Munich (and Garching), Pisa, and Rome. The top result comes from London, which has more than three times as many highly cited papers as expected (46 versus 14.3).
The worst performer is Moscow, which has only 21 highly cited papers compared to an expected value of 78.7. Bornmann and Leydesdorff also highlight the performance of Cambridge in the UK, which merely matched expectations, producing 21 highly cited papers against an expected 21.7.
Bornmann and Leydesdorff’s maps raise a number of questions. Not least of these is the performance of Cambridge, MA, home to two of the world’s top institutions, MIT and Harvard, which could reasonably be expected to feature strongly in the data. Yet Cambridge, MA, does not appear at all.