The most definitive scientific assessment of global warming to date, a report released earlier this month from the Intergovernmental Panel on Climate Change (IPCC), concluded with “very high confidence” that humans are contributing significantly to global warming. The report also precisely defines the scientific uncertainties concerning the extent, impacts, and timing of global warming. Ronald Prinn, professor of atmospheric chemistry at MIT and one of the lead authors of the report, says that estimating and understanding these uncertainties is key to evaluating climate data and to deciding on a course of action. Prinn, a leading climate scientist and the director of a worldwide project that carefully monitors the amounts of dozens of greenhouse gases, recently sat down with Technology Review to explain why climate-change science is uncertain, how technology is reducing that uncertainty, and what challenges remain.
Technology Review: What is the IPCC report based on?
Ronald Prinn: The IPCC asked the climate modelers to do two exercises. One, to simulate the climate that we actually have. Here are the greenhouse gases over the last 100 years. Here are the sulfate aerosols that come from coal and actually cool the system. So all of these human influences are prescribed. And then they say run your models and give us your output for global average temperature.
Then they were given a separate exercise: take the [year] 1900 greenhouse-gas levels and keep them constant for the 100 years.
At the global scale, it’s pretty clear that the results from the two exercises totally diverge at about 1960. If you look at that and ask, “Could I explain the observed temperatures with models that don’t include human-caused increases in greenhouse gases?,” the answer is one chance in ten.
TR: This confidence – that there is a 90 percent chance human influences cause global warming – is new to this report. Where does that confidence come from?
RP: It is a combination of a growing volume of observations of the planet and various signals of climate change, but [there’s] also a significant component of computer modeling. From the observations alone, you can’t answer key questions, so you need to combine observations with computer models of the system.
In the last six years, these climate models have gained in realism; they have improved in their physics, chemistry, and biology. They’ve improved their spatial resolution because computers have gotten faster. There’s better confidence that these models are looking a bit more like the planet we live on. But they’re never going to be to the point of simulating every leaf, and so on.
[It’s also important] that there are now enough climate models being built around the world that different philosophies can be encapsulated in them.
TR: Are there technological advances that have made modeling better?
RP: The fact that computers have been getting faster and faster has been extremely important. For observing greenhouse gases, there have been big strides in the last 10 years, particularly with the introduction of automated mass spectrometers. In the last three or four years, satellite remote sensing – using very high-resolution spectra measured from space to deduce trace-gas concentrations around the world – has been growing, although it has not played much of a role to this point. Looking to the future, the use of satellite remote sensing of greenhouse gases is certainly going to grow.
As far as temperature is concerned, the same story is true. We have a ground network. We have a meteorological balloon network. But particularly in the last 20 years, the measurement of temperatures at infrared and microwave wavelengths from orbiting satellites has been getting better and better. It doesn’t give the altitude resolution of a balloon rising through the air. But it gives you global coverage, whereas the previous observing systems were spotty.
TR: What uncertainties remain, and how might technology help resolve them?
RP: One [challenge] is observing the ocean below the surface. One key concern is [whether] the deep overturning of the ocean from top to bottom is going to slow down in the future. At the moment, the ocean is a sink for heat, so it’s slowing down the warming. That sink for heat is possible only because you can take surface water down to the depths and bring cold water up to replace it.
The temperature record and salinity record in the ocean are very poorly sampled, because you have to lower instruments on cables to the bottom of the ocean and bring them back up. Cheap sensors of temperature, salinity, and depth would be a big step forward. Their initial use would be understanding the ocean as it’s working right now.
TR: One of the IPCC charts shows that there is large uncertainty about the role of clouds. Why is that?
RP: One of the big differences when you see these models differing from one another is the way they handle clouds and convection. This relates to human influence because of aerosols – things like sulfates. When you throw sulfate aerosols into a preexisting cloud, water evaporates off the droplets that are there and recondenses on the sulfuric acid. For the same volume of water in the cloud, you now produce more cloud particles, and that increases the total surface area. That means the clouds are more reflective. So this is a human-induced increase in the reflectivity of existing clouds.
The uncertainty arises because, to assess that, you have to know the distribution of these short-lived aerosols all around the world. And then you have to understand the physics and the microphysics of these clouds in detail that you just do not have at the present time. Part of understanding convection and clouds is also measuring cloud properties droplet by droplet, and that requires special instrumentation and a lot of work in the lab … If you have a droplet of water and a particle of sulfuric acid, what are the things that determine the transfer of water from the droplet to the particle?
TR: Can technology ever completely remove uncertainty about climate change?
RP: This is not a mathematical theorem that we can say at the end, QED. It will never be that. There will always be uncertainty, because it’s a highly uncertain system. But for the first time, [the IPCC authors] do work very, very hard to estimate the uncertainty on every key number.
TR: Given that this is not a mathematical theorem, how can you best deal with that uncertainty?
RP: Bring in lots of models, but those that you bring in have got to be credible models. One of the stipulations for getting your information into the assessment is that you have to have published it in a peer-reviewed journal, not in a Wall Street Journal op-ed or something like that. This is not credibly done with op-eds and popular magazines. There is no legitimate role in the IPCC assessment, and for good reason, for things that are not published in the peer-reviewed literature. Some don’t like that, but that’s the nature of it, and it’s one of the rules of the game, because it’s supposed to be a scientific assessment.
Nothing is purely scientific, of course, because every person has their own views and philosophies about how to do things. My own view of how to get around the uncertainties is to objectively estimate them. Do your best to estimate them at every point.