Laurie Johnson’s Blog

Climate disruption: the 10 percent doctrine?

Laurie Johnson

Posted June 19, 2013 in Curbing Pollution, Solving Global Warming


What do the odds of a catastrophe have to be for insurance to be a wise investment? Apparently extremely low, judging by parents who buy life insurance. The probability of dying at age 35 is about 0.1 percent, and the odds of both parents dying are far smaller: roughly one in a million, if the risks are independent. Yet, against these probabilities, parents routinely spend thousands of dollars on this insurance.
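The arithmetic behind this insurance analogy can be sketched in a few lines. Only the 0.1 percent mortality figure comes from the text; the payout and premium amounts below are hypothetical, chosen just to illustrate the point:

```python
# Illustrative expected-value arithmetic for the life-insurance analogy.
# Only p_death (~0.1%) comes from the text; the dollar figures are hypothetical.

p_death = 0.001          # probability of dying at age 35 (from the text)
payout = 500_000         # hypothetical policy payout, in dollars
premium = 2_000          # hypothetical annual premium, in dollars

expected_payout = p_death * payout   # 0.001 * 500,000 = 500.0

# The premium far exceeds the expected payout, yet buying the policy is
# rational: the point of insurance is avoiding a ruinous loss, not beating
# the average.
print(premium > expected_payout)     # True

# If the two parents' risks were independent, the chance that both die is
# the product of the individual probabilities: about one in a million.
p_both = p_death ** 2
```

The same logic, applied to a planet rather than a household, is what the rest of the post is about.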

Why, then, haven’t we made aggressive investments in climate protection, when the odds of a catastrophe and the number of lives at stake appear to be far larger? One prominent climate economics model (more on it below) estimates a 10 percent chance of catastrophe if global temperatures increase more than 2°C above preindustrial levels, yet the lack of serious mitigation efforts has the world on track for 3.6°C to 5.3°C of warming (p. 9).

In a recent PBS column, reporter Paul Solman asked one very prominent economist to speculate on this dilemma. Martin Weitzman, economics professor at Harvard (and formerly Yale and MIT), presented the economics of uncertainty in no uncertain terms:

  • [W]e are undertaking a colossal planet-wide experiment of injecting CO2 into the atmosphere that goes extraordinarily further and faster than anything within the range of natural CO2 fluctuations for tens of millions of years…[I]n this kind of situation, for an economist, abating CO2 emissions is like buying insurance against a catastrophe…The bottom line is that if we continue on a business-as-usual trajectory, then there is some non-trivial probability of a catastrophic climate outcome materializing at some future time…. If we don't start buying into this insurance policy soon, the human race could end up being very sorry should a future climate catastrophe rear its ugly head.

These are strong words for an economist: the famous joke about economists is that you will never find a one-handed one (on the one hand…).

The model I referred to above is one of the most widely used in the economics literature and by policymakers: PAGE (Policy Analysis of the Greenhouse Effect), developed by Chris Hope at the University of Cambridge and used by, among others, the UK and the US to estimate the benefits of emissions reduction policies.

PAGE’s assumptions are based upon the overwhelming scientific consensus that an increase above 2°C puts the climate at risk of reaching a “tipping point,” beyond which irreversible catastrophic damages could unfold. Corresponding to the 2°C mark is an atmospheric CO2 concentration of 450 ppm. PAGE assumes that the probability of a catastrophic event rises with temperature, and that the economic damages associated with such an event rise as well.

Of course, passing 450 ppm doesn’t mean there will be a catastrophe. We can only know that the risk increases the more CO2 exceeds that level. But the business-as-usual scenario is of no comfort. In 2001, the Intergovernmental Panel on Climate Change projected a range of approximately 550 to 1,000 ppm by 2100; by 2010, CO2 emissions had reached levels consistent with the higher-end projections. To put these projections in context, consider them relative to historical fluctuations:

[Figure: CO2 concentrations over the past 600,000 years, with 2001 IPCC projections to 2100]

FIGURE DESCRIPTION: The horizontal axis runs back in time from the present (0, at the right end) to 600,000 years ago, combining two ice core records (EPICA and Vostok). The small red bar on the right vertical axis marks the increase in CO2 concentrations between 1958 and 2007. On this time scale, those 50 years of measurements span less than the thickness of the line, so the rise appears vertical, as do the projections to 2100 (the six arrows above the 2007 mark).

Any rational person should find this graph alarming: preindustrial levels were around 300 ppm or less, and the natural range between ice ages and warm periods like ours is about 100 ppm. This year, we blew past the 400 ppm milestone and, if we do nothing, risk reaching 1,000 ppm. For Weitzman, the huge uncertainties associated with this experiment don’t preclude a certain policy choice:

  • Admittedly, almost all of the relevant probabilities in this kind of rough analysis are uncomfortably indeterminate. But that is the nature of the beast here and shouldn't be an excuse for inaction...Prudence would seem to dictate taking action to cut back greenhouse gas emissions significantly.

Weitzman's reasoning is not unlike the famous wager proposed by the 17th-century French Catholic philosopher Blaise Pascal, which goes something like this: “Given the possibility that God actually does exist, and the infinite loss associated with non-belief (eternal damnation), a rational person should live as though God exists. If God does not actually exist, such a person will have only a finite loss (some pleasures, luxury, etc.). An infinite cost times even a tiny probability is still ... an infinite cost.”[1]

Replace “God” with “catastrophic climate risk” in the previous paragraph.
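The expected-value logic behind the wager can be made concrete with a rough sketch. Only the 10 percent probability comes from the PAGE estimate cited earlier; the dollar figures are hypothetical placeholders:

```python
# Expected-cost sketch of Pascal-style reasoning applied to climate risk.
# Only the 10% probability comes from the text (PAGE's catastrophe estimate
# past 2°C); the dollar figures are hypothetical placeholders.

p_catastrophe = 0.10         # ~10% chance of catastrophe (from the text)
catastrophic_loss = 50e12    # hypothetical: $50 trillion in damages
mitigation_cost = 1e12       # hypothetical: $1 trillion of aggressive mitigation

expected_loss = p_catastrophe * catastrophic_loss   # $5 trillion

# Insurance logic: paying the (certain) mitigation cost beats bearing the
# (probabilistic) expected loss whenever the former is smaller.
print(expected_loss > mitigation_cost)   # True

# Pascal's point: as the potential loss grows without bound, even a small
# probability produces an expected loss that dominates any finite premium.
for loss in (1e12, 1e13, 1e14, 1e15):
    print(0.01 * loss > mitigation_cost)  # False, False, False, True
```

On this view, the exact probability matters less than the fact that the downside is unbounded, which is Weitzman's argument in miniature.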

Another analogy is the famous “One Percent Doctrine,” drawn from Dick Cheney’s assessment of the risks of terrorism:

  • If there's a 1% chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It's not about our analysis ... It's about our response.

One might dismiss Cheney’s assessment on the grounds that one disagrees with his anti-terrorism policies, but his statement reflects the nation’s longstanding approach to national security: we spend enormous resources guarding against unknown risks. Every year, the US military budget is well over $600 billion.

One percent is a small number, but if that is the risk of a catastrophe, it makes sense to invest a great deal to prevent one; a ten percent risk is unacceptable. Yet, by one estimate, the federal government spent only $25 billion on low-emission technologies in 2010 (excluding short-term stimulus).

Instead, we should be taking aggressive actions, like those outlined by the International Energy Agency, to keep CO2 from exceeding 450 ppm. Time is running out fast: anything built from now on that emits carbon will keep emitting for decades, a “lock-in” effect that is the single factor most likely to make climate change irreversible. If we don’t stop locking in high carbon emissions within the next five years, the results are likely to be disastrous.


[1] This paraphrase is adapted from Solman’s piece and Wikipedia’s description.


About

Switchboard is the staff blog of the Natural Resources Defense Council, the nation’s most effective environmental group. For more about our work, including in-depth policy documents, action alerts and ways you can contribute, visit NRDC.org.
