Major reports are concluding that stabilizing greenhouse-gas levels in time to avoid catastrophic climate change is possible, and can be done at relatively low cost. But the details of those reports make clear that once real-world issues are factored in, such as delays in developing and implementing technology and policy, the cost of addressing climate change rises sharply. Switching from fossil fuels to low-carbon sources of energy will cost $44 trillion between now and 2050, according to a report released this week by the International Energy Agency. That sounds like a lot of money, but the report also concludes that the switch to low-carbon technologies such as solar power, together with anticipated improvements in efficiency, will bring huge savings from reduced fossil-fuel consumption. As a result, the world actually comes out slightly ahead: the cost of switching will be paid for in fuel savings between now and 2050.
Last month a major report from the U.N.’s Intergovernmental Panel on Climate Change said that efforts to stabilize greenhouse-gas levels would require investments of about $13 trillion through 2030. It also noted that reducing emissions would slow economic growth (as a result of factors such as higher energy prices), but only by an average of less than a tenth of a percentage point per year between now and 2100.
These cost estimates, however, are based on idealized scenarios. They give a sense of what getting away from fossil fuels will cost if we all act now and make smart decisions going forward, and if technologies work out the way we hope they will. One of the biggest factors is how long it takes to start reducing emissions. In 2012, the IEA estimated the cost of switching to low-carbon energy at only $36 trillion, $8 trillion less than the current figure. The increase is largely because emission rates and atmospheric greenhouse-gas levels have risen in the intervening years, making the problem harder to solve. The IPCC report showed that continuing to hold off on reducing emissions could increase costs by 40 percent if the delay leaves emissions 50 percent higher in 2030 than they are in ideal scenarios.
Aside from delays in action, many other factors will increase costs: costs will rise if countries don’t all work together, and they’ll also rise if technologies don’t work as expected. The most glaring example is technology for capturing and storing carbon dioxide (CCS). According to the IPCC, if this technology can’t be deployed, the cost of stabilizing greenhouse-gas levels will more than double (see “The Cost of Limiting Climate Change Could Double without Carbon Capture Technology”).
Robert Pindyck, a professor of economics and finance at MIT, says that attempts to make decisions about climate change based on cost-benefit analysis are doomed to fail because both the costs and the benefits are uncertain. “All we can do is speculate,” he says. “We don’t really know the costs. We don’t really know the benefits.” He says, however, that the chance of a catastrophic outcome should be enough to motivate investment in averting climate change even in the face of uncertainty, just as people buy health insurance without knowing whether it will pay off.
Although actual costs can’t be predicted with precision, cost estimates like the ones from the IPCC and IEA do have an important role: they can tell policymakers what to focus on. Climate negotiators have known for some time that acting quickly is important, and the reports make this even clearer by showing just how much delays can add to costs. The data also help suggest which technologies need the most attention. The IPCC found, for example, that forgoing CCS could raise costs far more than, say, limiting the amount of solar power on the grid, suggesting that efforts on CCS should be given high priority.