Climate Change: "Many, Many Smoking Guns"

The fourth in a series of international reports, due out tomorrow, reflects better science and more-precise models.
February 1, 2007

When the UN-organized Intergovernmental Panel on Climate Change (IPCC) presents its projections for global warming and future climate changes tomorrow, the report’s hallmark will be a far greater level of certainty and precision than what was expressed in the last IPCC report, issued in 2001. “The certainty is huge,” says Andrew Weaver of the University of Victoria in British Columbia, Canada’s top climate modeling expert and a coauthor of the new IPCC report.

To be sure, continued warming observed since 2001 is part of that certainty. But climatologists say the bigger factor is the broad accumulation of science over the past six years that has increased the precision with which climate models predict future climate change, debunked alternative hypotheses advanced by skeptics, and identified the footprint of man-made climate change in every corner of the earth.

As Weaver put it, the IPCC has not just found a smoking gun linking greenhouse gas emissions and climate change. Rather, its fourth report delivers a smoking arsenal. “There are many, many smoking guns,” he says. “It’s a battalion of smoking intercontinental ballistic missiles.”

Jerry Mahlman, a senior research associate at the U.S. National Center for Atmospheric Research (NCAR), in Boulder, CO, and a peer reviewer for the IPCC, says that while the final language is still being hammered out, the report might end up expressing 99 percent certainty that greenhouse gas emissions, primarily from burning fossil fuels, are warming Earth, up from “greater than 90 percent” confidence in the previous report. “It’s very obvious that the earth is warming up exactly as we’ve projected it to do so,” says Mahlman. One recent draft notes that IPCC’s projection in 1990 that global average temperature would rise by between 0.15 and 0.3 °C per decade through 2005 compares well with the 0.2 °C increase that actually occurred.
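The comparison Mahlman cites is simple arithmetic, and it can be sketched in a few lines of Python using only the figures quoted above (the 0.15–0.3 °C-per-decade projection from 1990 and the roughly 0.2 °C per decade actually observed through 2005):

```python
# All figures are taken from the article; this just checks that the 1990
# IPCC projection range brackets the observed warming trend.
projected_low, projected_high = 0.15, 0.30   # °C per decade, projected in 1990
observed = 0.20                              # °C per decade, observed through 2005

print(projected_low <= observed <= projected_high)  # True: observation falls in range

# Cumulative warming over 1990-2005, about 1.5 decades:
decades = 1.5
print(f"cumulative warming: {observed * decades:.2f} °C")  # 0.30 °C
```

A trend of 0.2 °C per decade thus works out to roughly 0.3 °C of total warming over the fifteen years between the projection and the observation.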

Today, many detailed scientific reports are detecting global warming’s fingerprints rather than simply glimpsing the outline of its footprint. The second and third IPCC assessments, issued in 1996 and 2001, respectively, built a case for man-made climate change on increased global average temperature above that expected from natural variability. Weaver says the fourth report, in contrast, will identify the signal of man-made climate change in every region of the globe and in many more variables beyond temperature, such as increases in intense tropical cyclones and forest fires.

“We’re finding the signal of climate change in more and more places,” says Stanley Solomon, a scientist with NCAR’s Earth & Sun Systems Lab. For example, last year Solomon published the first definitive identification of man-made climate change in the thermosphere, the uppermost layer of Earth’s atmosphere. The thermosphere was, paradoxically, predicted to cool and thin with increased carbon dioxide. And that is exactly what Solomon and his colleagues found. They detected the expected cooling by noting a small but statistically significant decline in the drag on satellites traveling through the thermosphere.

The biggest challenge faced by Solomon: ruling out the effects of solar storms, which skeptics of man-made climate change have proposed might be causing global warming. Solomon and others have refuted this notion but confirmed the storms’ dominance as a temperature driver in the thermosphere. “Solar-driven changes, while they’re extremely large above 100 kilometers, get progressively smaller and smaller as you get lower into the atmosphere, and are extremely small once you reach the surface,” says Solomon. Other scientific studies have identified human influence in everything from declining mountain glaciers and snow cover in both hemispheres, to increases in ocean salinity, to the growing frequency and range of severe droughts.

And as the science has peered into more areas, the models have gotten sharper, too. New modeling techniques pioneered by an Oxford University-based distributed-computing program helped the IPCC express its predictions more precisely. The early draft of the IPCC report leaked last week predicts a far tighter range of temperature and sea-level increase over the next century than the previous IPCC report did. For example, the draft projects that sea level will rise between 28 and 43 centimeters by 2100, compared with the 9-to-88-centimeter rise forecast in 2001.

Probabilistic models are also helping the IPCC come to consensus over the most likely figure within their predicted range. In the case of temperature, the leaked draft says that if carbon dioxide concentrations stabilize at 550 parts per million, the planet will warm 2 to 4.5 °C, but it adds a “best estimate” of 3 °C, which is slightly below the mean for that range. Of course, the future trajectory of greenhouse gas emissions and consequent warming depends greatly on future energy and transportation policies and trends. Weaver estimates that uncertainty in emissions contributes about 50 percent of the uncertainty in most climate-change predictions. The rest comes from imperfect understanding of how the climate’s physical and biological components interact, such as how much cloud cover will result from changes in temperature.
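Weaver's point about the two sources of uncertainty can be illustrated with a small Monte Carlo sketch. This is not the IPCC's actual method, and the emissions spread below is a hypothetical number chosen for illustration; only the 2-to-4.5 °C sensitivity range comes from the article. Modeling projected warming as an uncertain emissions factor times an uncertain climate response shows how fixing the emissions path narrows the spread:

```python
import random
import statistics

random.seed(42)

# Illustrative only: warming is modeled as (emissions factor) x (climate
# sensitivity). The emissions factor range is a made-up stand-in for policy
# uncertainty; the 2.0-4.5 °C sensitivity range is the one cited above.
def warming_samples(n=50_000, fix_emissions=False):
    samples = []
    for _ in range(n):
        emissions = 1.0 if fix_emissions else random.uniform(0.6, 1.4)
        sensitivity = random.uniform(2.0, 4.5)   # °C warming at 550 ppm
        samples.append(emissions * sensitivity)
    return samples

full_spread = statistics.stdev(warming_samples())
physics_only = statistics.stdev(warming_samples(fix_emissions=True))
print(f"spread with both uncertainties: {full_spread:.2f} °C")
print(f"spread from climate response alone: {physics_only:.2f} °C")
```

With these assumed numbers the spread roughly halves once emissions are held fixed, which is the shape of Weaver's estimate: about half the predictive uncertainty would vanish if future emissions were known, and the remainder reflects imperfect understanding of the climate system itself.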

What is most clear is that the increasing scientific certainty around global warming has emboldened climate scientists as much as it has isolated the small band of skeptics swimming against the scientific mainstream. “There is no credible alternative hypothesis,” says Mahlman. “It simply doesn’t exist.” Weaver predicts that the skeptics will soon fade away as funders increasingly withdraw support from the organizations and researchers leading the anti-IPCC charge, as oil and gas giant ExxonMobil announced it had done last month with the Washington-based Competitive Enterprise Institute, a conservative think tank. It was the same story with the “smoking doesn’t cause cancer” crowd, says Weaver: “Once the funding breaks down, they’ll break down.”

Illustration by Rose Wong