Blog post

Blogs review: the discounting debate in climate change mitigation

Publishing date
20 April 2012

What’s at stake: Decisions with respect to climate change action depend on various parameters, but a particularly important one is the choice of the social discount rate (SDR), which captures – among other things – the rate at which we discount the welfare of future generations. This choice has direct implications for whether climate change mitigation policies should be delayed or accelerated. In discussions about the appropriate SDR, two main approaches have emerged: an a priori approach (as proposed by Stern) and a market-based approach (as proposed by Nordhaus and Weitzman).

Standard Cost-Benefit Analysis

The 2006 Stern Review on the Economics of Climate Change broke down the social discount rate into three components: the pure time discount rate (d), the elasticity of the marginal utility of consumption (h), and the growth rate of per-capita consumption (g): SDR = d + hg. In terms of numeric values, Stern sets d=0.1% (0.001), reflecting a 0.1% chance of human extinction each year. Stern thus treats current and future generations essentially equally; the only difference between the two arises from a small possibility of extinction. Stern then sets h=1, which implies that the marginal utility of consumption is inversely proportional to consumption: an extra unit of consumption is worth twice as much to someone who is half as well off. A high value of h means that transfers of wealth from rich to poor are strongly endorsed, while a low h implies little concern about distributional equality. Stern then makes a positive (empirical) assumption about growth and sets g=1.3%. This leads to an SDR of 1.4%. With this value, urgent action against climate change is warranted. The practical policy implication Stern calls for is an investment of 1% of global GDP in the fight against CO2 emissions, which would stabilize global stocks of GHG below the 550ppm threshold.
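
As a quick check on this arithmetic, here is a minimal sketch in Python (not from the original review) of the Ramsey rule above under Stern's calibration:

```python
# Ramsey rule for the social discount rate: SDR = d + h * g
# d: pure time discount rate, h: elasticity of the marginal utility of
# consumption, g: growth rate of per-capita consumption.

def social_discount_rate(d: float, h: float, g: float) -> float:
    """Social discount rate implied by the Ramsey rule."""
    return d + h * g

# Stern Review calibration: d = 0.1%, h = 1, g = 1.3%
print(f"Stern's SDR: {social_discount_rate(d=0.001, h=1.0, g=0.013):.1%}")  # -> 1.4%
```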

Martin Weitzman – of Harvard University – argues in the Journal of Economic Literature that the conclusions of the report rely on parameter values that are imposed a priori rather than grounded in the preferences people reveal in the marketplace. The author argues that it is possible to come up with more reasonable numbers based on observed market behavior. One option would be an SDR of 6%, obtained by setting d=2%, h=2, and g=2%. He further suggests that, once the deep uncertainty surrounding the topic is taken into account, the warranted SDR is still higher than Stern's: not necessarily the 6% of the previous example, but perhaps somewhere in the 2–4% range. This would create a middle ground between Stern’s recommendations and more skeptical views according to which climate change action should be delayed.
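
The practical stakes of choosing between rates like 1.4% and 6% are easiest to see in present values. The sketch below is purely illustrative: the $1 trillion damage figure and 100-year horizon are assumptions, not numbers from either author.

```python
import math

def present_value(amount: float, rate: float, years: float) -> float:
    """Present value of a future amount under continuous discounting."""
    return amount * math.exp(-rate * years)

damage, horizon = 1e12, 100  # hypothetical $1 trillion of damage, 100 years out
for label, rate in [("Stern, 1.4%", 0.014), ("Weitzman, 6%", 0.06)]:
    print(f"SDR ({label}): ${present_value(damage, rate, horizon):,.0f}")
# 1.4% values the damage at roughly $250 billion today; 6% at under $3 billion.
```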

William Nordhaus argues that the findings of Stern – that “if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more” – differ markedly from those of economic models that calculate least-cost emissions paths to stabilize concentrations, or paths that balance the costs and benefits of emissions reductions. Mainstream economic models do find it economically beneficial to take steps today to slow global warming, but efficient policies generally involve modest rates of emissions reductions in the near term, followed by sharp reductions in the medium and long term. Nordhaus suggests setting h=3 and keeping d close to 0. This, he argues, would lead to real returns and savings rates closer to the values observed in the economy. Stern’s findings rely on a low discount rate and low inequality aversion, which produce savings rates and real returns very different from actual economic values. In a recent article for The New York Review of Books, Nordhaus argues in favor of immediate but moderate action, via cap-and-trade or carbon taxes.
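
As a rough illustration of Nordhaus's calibration point, raising h to 3 while keeping d near zero pushes the Ramsey-rule rate up toward the real returns observed on capital (the 2% growth rate below is an assumption for illustration, not a figure from the post):

```python
def social_discount_rate(d: float, h: float, g: float) -> float:
    """Ramsey rule: SDR = d + h * g."""
    return d + h * g

# Nordhaus-style calibration: d near zero, h = 3; the 2% consumption growth
# rate is an assumed value used here only for illustration.
print(f"{social_discount_rate(d=0.001, h=3.0, g=0.02):.1%}")  # -> 6.1%
```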

Partha Dasgupta – of Cambridge University – is skeptical that Stern’s framework remains defensible in the face of small changes in parameter values. If g were 0 (keeping d and h fixed at the values chosen by Stern), Dasgupta argues that current generations ought to save 97.5% of current production for future generations. He thus accuses Stern’s framework of normative bias. Brad DeLong responds to Dasgupta’s assessment: working through the arithmetic underlying Dasgupta’s conclusion, he finds that a savings rate of 22.5%, not of 97.5%, is in fact required, which renders Stern’s prescription more reasonable. DeLong also advocates d=0, which implies that current and future generations carry equal weight in climate change mitigation. DeLong goes on to agree with Dasgupta on the importance of the h parameter: he suggests that a range of values of h, from 1 to 5, should be considered, and that this should also have been the approach of the Stern Review.
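
The savings-rate arithmetic behind this exchange is not spelled out in the post. One standard reconstruction that reproduces Dasgupta's 97.5% figure is the optimal saving rate of an AK growth model under Stern's d and h, assuming a 4% return on capital; DeLong's lower figure turns on different auxiliary assumptions about production:

```python
# Reconstruction (not the authors' own code) of the optimal saving rate in an
# AK growth model: consumption grows at (A - d) / h, and on the balanced path
# the saving rate is that growth rate divided by the return on capital A.

def ak_saving_rate(A: float, d: float, h: float) -> float:
    """Optimal saving rate on the balanced path of an AK model."""
    return (A - d) / (h * A)

# Stern's d and h in Dasgupta's thought experiment; A = 4% is an assumed
# return on capital, used here only to reproduce the quoted figure.
print(f"{ak_saving_rate(A=0.04, d=0.001, h=1.0):.1%}")  # -> 97.5%
```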

Richard Tol conducts a meta-analysis of estimates of the social cost of carbon. He finds a downward trend in the estimates over time and that the Stern Review is an outlier: its impact estimates are pessimistic even compared to other studies in the gray literature and other estimates that use low discount rates. The uncertainty about the social cost of carbon is so large that the tails of the distribution may dominate the conclusions.

Fat tails and the economics of climate change

In a recent article, Martin Weitzman argues that the most striking feature of the economics of climate change is that its extreme downside is non-negligible, and that the way in which this deep uncertainty is conceptualized and formalized influences the outcome of any reasonable cost-benefit analysis (CBA) of climate change. The author does not argue that the standard model is wrong or even implausible, but rather that it may not be robust with respect to the modeling of catastrophic outcomes.

He suggests that the unprecedented scale and speed of GHG increases bring policymakers into uncharted territory, rendering climate predictions very difficult. Current CBA models assume that the climate will respond in a certain way to increased GHG emissions and that this response will follow a distribution with thin tails, implying very little concern for potentially extreme outcomes. Weitzman points out that modeling tail probabilities one way or the other can have huge consequences. He contrasts a fat-tailed Pareto distribution with a thin-tailed normal distribution, the latter typically being at the heart of CBA models. Simulations using the two lead to extremely different estimates, particularly regarding how responsive high-temperature probabilities are to the level of GHGs. He concludes that the focus should shift away from central tendencies and towards extreme tails.
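
To illustrate the kind of contrast Weitzman has in mind, the sketch below compares upper-tail probabilities under a thin-tailed normal and a fat-tailed Pareto distribution. The parameters are invented for illustration and are not taken from Weitzman's simulations:

```python
import math

def normal_tail(x: float, mu: float = 3.0, sigma: float = 1.5) -> float:
    """P(X > x) under a thin-tailed normal distribution (illustrative parameters)."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def pareto_tail(x: float, x_min: float = 1.0, alpha: float = 1.5) -> float:
    """P(X > x) under a fat-tailed Pareto distribution (illustrative parameters)."""
    return (x_min / x) ** alpha if x > x_min else 1.0

# Probability of warming beyond a threshold (degrees C, purely illustrative):
# the normal tail collapses towards zero while the Pareto tail stays sizable.
for threshold in (6, 10, 15):
    print(f"P(T > {threshold:>2}C)  normal: {normal_tail(threshold):.1e}"
          f"   pareto: {pareto_tail(threshold):.1e}")
```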

William Nordhaus provides an assessment of Weitzman’s initial views on this topic. He focuses in particular on Weitzman’s point that, under certain strict assumptions about the structure of uncertainty and preferences, “society has an indefinitely large expected loss from high-consequence, low-probability events” (the ‘dismal theorem’). Nordhaus stresses that the theorem only applies under very limited conditions: strong risk aversion, a fat tail for the uncertain variables, and an inability on society’s part to act in time against climate change. Nonetheless, he finds that tail events deserve attention and careful analysis, and acknowledges Weitzman’s merit in drawing attention to the existence of deep uncertainties. He goes on to suggest that it is unclear when cost-benefit analysis should or should not be employed, concluding that this decision will rest largely on “the uncertainty surrounding specific issues and phenomena, as well as attitudes toward risk.”
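
The flavor of the dismal theorem can be shown numerically: with power (CRRA) marginal utility and a fat-tailed shock to log consumption, expected marginal utility fails to converge. In the sketch below, the half-Cauchy shock and the risk-aversion value of 2 are assumed for illustration, not taken from either paper:

```python
import math

ETA = 2.0  # coefficient of relative risk aversion (assumed value)

def half_cauchy_density(t: float) -> float:
    """Density of a fat-tailed (half-Cauchy) shock to log consumption, t >= 0."""
    return 2.0 / (math.pi * (1.0 + t * t))

def truncated_expected_marginal_utility(bound: float, steps: int = 100_000) -> float:
    """E[c^(-ETA)] with c = exp(-t), integrating the shock t only up to `bound`."""
    dt = bound / steps
    return sum(math.exp(ETA * (i * dt)) * half_cauchy_density(i * dt) * dt
               for i in range(steps))

for bound in (5, 10, 20, 40):
    value = truncated_expected_marginal_utility(bound)
    print(f"shock truncated at {bound:>2}: E[u'(c)] = {value:.3e}")
# The truncated expectation keeps exploding as the bound rises: with a fat
# tail, the full expectation is infinite, so expected losses are unbounded.
```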

About the authors

  • Jérémie Cohen-Setton

    Jérémie Cohen-Setton is a Research Fellow at the Peterson Institute for International Economics. Jérémie received his PhD in Economics from U.C. Berkeley and worked previously with Goldman Sachs Global Economic Research, HM Treasury, and Bruegel. At Bruegel, he was Research Assistant to Director Jean Pisani-Ferry and President Mario Monti. He also shaped and developed the Bruegel Economic Blogs Review.
