Blog post

Traditional vs. Modern Macro

Publishing date
20 October 2011

What’s at stake: Larry Summers did not make a lot of friends earlier this year when he said that DSGE models played no role at all in the White House policy response to the crisis – that it was all IS-LM augmented by a liquidity trap. Since then, a series of debates has taken place on the econ blogosphere on the role of intermediate macro models, the usefulness of DSGE models, and the merits of calibration.

The IS-LM ghost

Paul Krugman, Greg Mankiw, and Brad DeLong have been strong defenders of the IS-LM framework as an important tool for conveying the intuition of how the economy works in the short run, a useful teaching device for understanding the interaction of markets, and a way for politicians to think simply about the options in a liquidity trap.

Tyler Cowen provides, on the other hand, six reasons to dislike the IS-LM model even as a crude way to think about the economy: it leaves aside the difference between real and nominal interest rates; it fudges the distinction between short- and long-term interest rates; it attaches too much importance to money as a non-interest-bearing asset, which in reality is not so crucial; and it ignores expectations (in a separate post, he quotes the recent Nobel prize laureate on this drawback). Finally, the LM curve – which is supposed to represent a reaction function of the central bank – does not match the actual workings of monetary policy. Matt Rognlie also emphasized this last point in a series of blog posts.

For Mark Thoma, the bottom line is simple: some questions are easier to answer using the IS-LM model (e.g. see here), some with the IS-MP model – the framework developed by David Romer that replaces the LM curve with an interest-rate rule (in both cases, coupled with a model of AS) – but in general there is no truly substantive debate here. The two models are alternative presentations of the same set of ideas.
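
To fix ideas, here is a stylized log-linear version of the two blocks (the notation is ours, not Romer’s exact formulation): an IS curve relating the output gap to the real interest rate, an MP curve that replaces LM with an explicit interest-rate rule, and a simple accelerationist AS relation to close the system,

$$\text{IS:}\quad y_t = -\alpha\,(r_t - \bar r) + \varepsilon_t, \qquad \text{MP:}\quad r_t = \bar r + \phi_\pi \pi_t + \phi_y y_t, \qquad \text{AS:}\quad \pi_t = \pi_{t-1} + \lambda y_t,$$

where $y_t$ is the output gap, $r_t$ the real interest rate and $\pi_t$ inflation. Substituting MP into IS traces out an aggregate demand relation without any reference to the money stock, which is what makes the framework easier to map into modern central bank practice than the LM curve.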

Paul Krugman has recently been blogging about the ongoing graduate macroeconomic history course by the Romers, which discusses the evidence on the impact of macroeconomic policies. Interestingly, the blogosphere has not, however, yet caught up with the undergraduate version of this course, where the authors extend David Romer’s intermediate macroeconomic framework to include a liquidity trap and financial intermediation (following Michael Woodford’s JEP paper). In summarizing how David Romer derives the IS-MP model from a micro-founded model in his graduate-level macroeconomics book, Mark Thoma furthermore outlines the complementarity of the traditional and modern approaches to Keynesian economics.

Stephen Williamson argues, however, that a more realistic description of the banking sector and central bank operations does not make the model more relevant, as the liquidity trap considered is a “grandma’s liquidity trap”, from a time when currency and liquidity were synonyms. Today, other liquid assets exist.

Slapping DSGEs

John Kay argues in an essay published by INET that economists have been led astray by excessive reliance on formal models derived from assumptions that bear too little similarity to the world we live in. Kay does not deny the usefulness of models as such. He admits that there are proper uses of models and of mathematical reasoning, yet argues that economists too often employ models of the wrong sort, computer simulations of “artificial worlds” that are mistaken for literal descriptions of reality.

Although challenging many of the points made by John Kay, Michael Woodford agrees that much model-based economic analysis imposes a requirement of internal consistency that is unduly strong, and that may result in unnecessary fragility of the conclusions reached. It has been standard for at least the past three decades to use models in which not only does the model give a complete description of a hypothetical world, and not only is this description one in which outcomes follow from rational behavior on the part of the decisionmakers in the model, but the decisionmakers in the model are assumed to understand the world in exactly the way it is represented in the model. The macroeconomics of the future, he believes, will still make use of general-equilibrium models in which the behavior of households and firms is derived from considerations of intertemporal optimality, but in which the optimization is relative to the evolving beliefs of those actors about the future, which need not perfectly coincide with the predictions of the economist’s model.

Progress in DSGE models: financial frictions, ZLB, and unemployment

The ongoing Great Recession was a stark reminder that financial frictions are a key driver of business cycle fluctuations, and DSGE models came under heavy criticism partly because they tended to assume away these frictions in the years leading up to the crisis.

Following the Princeton tradition of incorporating financial frictions in macroeconomic models, Markus Brunnermeier and Yuliy Sannikov recently organized a three-day workshop for PhD candidates on this issue. Presentations by Chris Sims, Nobuhiro Kiyotaki, and Hyun Shin (and hopefully soon the videos) are now available online. Brunnermeier et al. have a useful background paper that surveys the macroeconomic implications of financial frictions. The paper covers the literature on the following themes: persistence, amplification, and instability; credit quantity constraints through margins; the demand for liquid assets; and the role of financial intermediation.

Another criticism often addressed to DSGE models is that they do not incorporate a theory of unemployment. As an illustration, the term unemployment cannot be found in the index of Michael Woodford’s or Carl Walsh’s books – two classics of New Keynesian macroeconomics. Jordi Gali has taken up this task in a new book that aims to incorporate unemployment in a fully-fledged DSGE framework. The model reformulates the sticky-price, sticky-wage model of Erceg, Henderson and Levin (2000), but assumes that the margin of adjustment is the extensive margin of labor (unemployment vs. employment) rather than the intensive margin (hours worked per employee). In this model, unemployment fluctuations are reflected in fluctuations of the wage mark-up around its desired level. Based on this framework, Gali derives a measure of the output gap based on observable variables and argues that an interest-rate rule targeting the unemployment rate and inflation outperforms other policy rules. Jordi Galí, Frank Smets, and Rafael Wouters then incorporate this reformulated labor market block in a medium-scale version – read: a version close to the ones used in central banks – of the workhorse New Keynesian model.
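
To see the mechanics, here is a minimal sketch of the unemployment–wage-mark-up mapping (our notation, under the standard constant-elasticity labor-disutility assumption; not a substitute for Gali’s full derivation). With a log-linear marginal rate of substitution $mrs_t = \sigma c_t + \varphi n_t$ and a labor force $f_t$ pinned down by the indifference of the marginal participant, $w_t = \sigma c_t + \varphi f_t$, the average wage mark-up is

$$\mu^w_t \equiv w_t - mrs_t = \varphi\,(f_t - n_t) = \varphi\, u_t,$$

so movements in the unemployment rate $u_t$ are, up to the parameter $\varphi$, the mirror image of movements in the wage mark-up around its desired level.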

Marco Del Negro, Gauti Eggertsson, Andrea Ferrero, and Nobuhiro Kiyotaki show what a post-crisis DSGE model is now likely to look like. The model incorporates financial frictions and a liquidity trap in a DSGE model with nominal and real rigidities. This framework allows the authors to ask the following questions: Can a shock to the liquidity of private paper lead to a collapse in short-term nominal interest rates and a recession like the one associated with the 2008 U.S. financial crisis? Once the nominal interest rate reaches the zero bound, what are the effects of interventions in which the government exchanges liquid government assets for illiquid private paper? Using this framework, the authors argue that the effects of liquidity interventions can be large, and show some numerical examples in which such interventions prevented a repeat of the Great Depression in 2008-2009.

For those interested in learning the basics of DSGE models, David Romer has a great chapter in the new edition of his graduate textbook. The treatment is rigorous but easier than in Gali (2008), Woodford (2003) or Walsh (2010), and can therefore serve as a useful starting point in this literature. The bulk of the chapter extends the analysis of the microeconomic foundations of incomplete nominal flexibility to dynamic settings. It then presents an example of a complete DSGE model with nominal rigidity: the canonical three-equation New Keynesian model of Clarida, Galí, and Gertler (2000). Unfortunately, in many ways this model is closer to the baseline real-business-cycle model than to the ultimate objective: much of its appeal lies in tractability and elegance, not realism. The last section discusses elements of other DSGE models with monetary non-neutrality; because of the models’ complexity and the lack of agreement about their key ingredients, however, it stops short of analyzing other fully specified models.
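
For reference, the canonical three-equation model the chapter builds up to can be written, in one common notation (ours, abstracting from the details of the shock processes), as

$$x_t = E_t x_{t+1} - \tfrac{1}{\sigma}\left(i_t - E_t \pi_{t+1}\right) + u_t, \qquad \pi_t = \beta E_t \pi_{t+1} + \kappa x_t, \qquad i_t = \phi_\pi \pi_t + \phi_x x_t + v_t,$$

that is, a forward-looking IS curve, a New Keynesian Phillips curve and an interest-rate rule, where $x_t$ is the output gap, $\pi_t$ inflation and $i_t$ the nominal interest rate.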

Calibration and Estimation

Paul Krugman thinks that calibration is an acceptable practice, as long as you keep your perspective. In one of his home fields, international trade, there is fairly widespread use of “computable general equilibrium” models – fully internally consistent models of how prices and quantities fit together, with some parameters that are guesstimates based on the literature, and others that are tweaked so as to match actual trade flows in some base period. These models are then used for “what-if” analyses, especially of the effects of possible changes in trade policy. The models clearly aren’t literally true, and in no sense are you testing your theory. What you’re basically doing is elaborate thought experiments that are somewhat disciplined by the data, and which you hope are more informative than plain guesses.

The point is that if you have a conceptual model of some aspect of the world, which you know is at best an approximation, it’s OK to see what that model would say if you tried to make it numerically realistic in some dimensions. It is also true that a calibration exercise is informative when it fails: if there’s no way to squeeze the relevant data into your model, or if the calibrated model makes predictions that you know on other grounds are ludicrous, something was gained.
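
To make the two-step logic concrete, here is a deliberately toy sketch in Python – not any actual CGE model, and all numbers and the functional form are made up for illustration: one parameter is a guesstimate “from the literature”, another is backed out so that the model reproduces base-period trade flows, and the calibrated model is then used for a what-if experiment on trade policy.

# Toy illustration of the calibration logic described above (not any specific
# CGE model). All numbers below are made up for illustration.

epsilon = 1.5          # price elasticity of import demand: a guesstimate "from the literature"
base_income = 100.0    # base-period domestic income (made-up units)
base_price = 1.0       # base-period world price of the imported good
base_tariff = 0.05     # base-period ad-valorem tariff
base_imports = 20.0    # observed base-period import volume

def imports(scale, price, tariff, income, elasticity=epsilon):
    """Import demand: M = scale * ((1 + tariff) * price)**(-elasticity) * income."""
    return scale * ((1.0 + tariff) * price) ** (-elasticity) * income

# Calibration step: back out the scale parameter so the model reproduces the
# data exactly in the base period (the "tweaked to match actual trade flows" part).
scale = base_imports / (((1.0 + base_tariff) * base_price) ** (-epsilon) * base_income)

# What-if experiment: raise the tariff to 20% and ask what the model predicts.
new_tariff = 0.20
counterfactual = imports(scale, base_price, new_tariff, base_income)

print(f"calibrated scale parameter: {scale:.3f}")
print(f"imports at 5% tariff (matches data by construction): "
      f"{imports(scale, base_price, base_tariff, base_income):.2f}")
print(f"imports at 20% tariff (what-if): {counterfactual:.2f}")

The exercise is not a test of the theory: the base period is matched by construction, and the output is only as good as the guesstimated elasticity. It is exactly the kind of data-disciplined thought experiment Krugman has in mind.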

Stephen Williamson writes that part of what the calibration people were reacting to was a view among econometricians that quantitative work was about “testing” theories. The problem is that any macroeconomic model is going to be wrong on some dimensions. To be useful, a model must be simple, and simplification makes it wrong in some sense. Subjected to standard econometric tests, it will be rejected. Models that are rejected by the data can nevertheless be extremely useful. That point, he thinks, is now widely recognized, and you won’t find strong objections to it today, as you might have in 1982.

For more on this issue, we recommend this video by Chris Sims, the 2011 Nobel laureate in Economics, on calibration, statistical inference and structural change, given recently at the INET pow wow. Mark Thoma has an easy one-page introduction to VARs for those in need of catching up who never really understood how these things work.
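
For readers who want the one-line summary before clicking through: a reduced-form VAR(p) simply regresses a vector of macroeconomic variables $y_t$ on its own lags,

$$y_t = c + A_1 y_{t-1} + \cdots + A_p y_{t-p} + u_t, \qquad E[u_t u_t'] = \Sigma,$$

where the coefficient matrices $A_i$ can be estimated by OLS equation by equation; giving the innovations $u_t$ a structural interpretation then requires identifying restrictions on $\Sigma$, which is where most of the methodological debate lives.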

*Bruegel Economic Blogs Review is an information service that surveys external blogs. It does not survey Bruegel’s own publications, nor does it include comments by Bruegel authors.

About the authors

  • Jérémie Cohen-Setton

    Jérémie Cohen-Setton is a Research Fellow at the Peterson Institute for International Economics. Jérémie received his PhD in Economics from U.C. Berkeley and worked previously with Goldman Sachs Global Economic Research, HM Treasury, and Bruegel. At Bruegel, he was Research Assistant to Director Jean Pisani-Ferry and President Mario Monti. He also shaped and developed the Bruegel Economic Blogs Review.
