
Traditional vs. Modern Macro


Date: October 20, 2011 Topic: European Macroeconomics & Governance

What’s at stake: Larry Summers did not make a lot of friends earlier this year when he said that DSGE models played no role at all in the White House policy response to the crisis, and that it was all IS-LM augmented by a liquidity trap. Since then, a series of debates has taken place in the econ blogosphere on the role of intermediate macro models, the usefulness of DSGE models, and the merits of calibration.

The IS-LM ghost

Paul Krugman, Greg Mankiw, and Brad DeLong have been strong defenders of the IS-LM framework as an important tool for building intuition about how the economy works in the short run, a useful teaching device for understanding the interaction of markets, and a way for politicians to think simply about their options in a liquidity trap.

Tyler Cowen, on the other hand, provides six reasons to dislike the IS-LM model even as a crude model for thinking about the economy: it leaves aside the difference between real and nominal interest rates; it fudges the distinction between short- and long-term interest rates; it attaches too much importance to money as a non-interest-bearing asset, which in reality is not so crucial; and it ignores expectations (in a separate post, he quotes the recent Nobel prize laureate on this drawback). Finally, the LM curve – which is supposed to represent a reaction function of the central bank – does not match the real workings of monetary policy. Matt Rognlie has also emphasized this last point in a series of blog posts.

For Mark Thoma, the bottom line is simple: some questions are easier to answer using the IS-LM model (e.g. see here), others with the IS-MP model – the framework developed by David Romer that replaces the LM curve with an interest-rate rule (in both cases coupled with a model of aggregate supply) – but in general there is no truly substantive debate here. The two models are alternative presentations of the same set of ideas.
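To see the difference concretely, here is a stylized statement of the two closures (the notation is deliberately schematic and ours, not Romer’s exact formulation): the IS curve is common to both models, while the LM curve is replaced in the IS-MP variant by an MP curve describing a real-interest-rate rule for the central bank.

```latex
\begin{align*}
\text{IS:} \quad & Y = C(Y - T) + I(r) + G \\
\text{LM:} \quad & M/P = L(i,\, Y), \qquad i = r + \pi^{e} \\
\text{MP:} \quad & r = r(\pi,\, Y), \qquad \partial r / \partial \pi > 0
\end{align*}
```

The MP closure addresses, at least in part, two of the objections listed above: money demand plays no direct role, and the policy block can be read as a description of how modern central banks actually set interest rates.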

Paul Krugman has recently been blogging about the ongoing graduate macroeconomic history course taught by the Romers, which discusses the evidence on the impact of macroeconomic policies. Interestingly, the blogosphere has not yet caught up with the undergraduate version of this course, where the authors extend David Romer’s intermediate macroeconomic framework to include a liquidity trap and financial intermediation (following Michael Woodford’s JEP paper). In summarizing how David Romer derives the IS-MP model from a micro-founded model in his graduate-level macroeconomics book, Mark Thoma furthermore outlines the complementarity of the traditional and modern approaches to Keynesian economics.

Stephen Williamson argues, however, that a more realistic description of the banking sector and of central bank operations does not make the model more relevant, since the liquidity trap considered is a “grandma’s liquidity trap” from a time when currency and liquidity were synonymous; today, other liquid assets exist.

Slapping DSGEs

John Kay argues in an essay published by INET that economists have been led astray by excessive reliance on formal models derived from assumptions that bear too little similarity to the world we live in. Kay does not deny the usefulness of models as such. He admits that there are proper uses of models and of mathematical reasoning, yet argues that economists too often employ models of the wrong sort: computer simulations of “artificial worlds” that are mistaken for literal descriptions of reality.

Although challenging many of the points made by John Kay, Michael Woodford agrees that much model-based economic analysis imposes a requirement of internal consistency that is unduly strong, and that may result in unnecessary fragility of the conclusions reached. It has been standard for at least the past three decades to use models in which not only does the model give a complete description of a hypothetical world, and not only is this description one in which outcomes follow from rational behavior on the part of the decisionmakers in the model, but the decisionmakers in the model are assumed to understand the world in exactly the way it is represented in the model. The macroeconomics of the future, he believes, will still make use of general-equilibrium models in which the behavior of households and firms is derived from considerations of intertemporal optimality, but in which the optimization is relative to the evolving beliefs of those actors about the future, which need not perfectly coincide with the predictions of the economist’s model.

Progress in DSGE models: financial frictions, ZLB, and unemployment

The ongoing Great Recession was a stark reminder that financial frictions are a key driver of business cycle fluctuations, and DSGE models came under heavy criticism partly because they tended to assume these frictions away in the years before the crisis.
Following the Princeton tradition of incorporating financial frictions into macroeconomic models, Markus Brunnermeier and Yuliy Sannikov recently organized a three-day workshop for PhD candidates on this issue. Presentations by Chris Sims, Nobuhiro Kiyotaki, and Hyun Shin are now available online (and hopefully the videos will be soon). Brunnermeier et al. have a useful background paper that surveys the macroeconomic implications of financial frictions, covering the literature on the following themes: persistence, amplification and instability; credit quantity constraints through margins; the demand for liquid assets; and the role of financial intermediation.
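To give a flavour of the “credit quantity constraints through margins” theme, the workhorse ingredient in much of this literature is a collateral (margin) constraint of the Kiyotaki-Moore type. The following is a stylized sketch in our own notation, not the survey’s:

```latex
b_t \;\le\; \theta\, q_t k_t
```

Borrowing $b_t$ is limited to a fraction $\theta$ (set by the margin, or haircut) of the market value $q_t k_t$ of the borrower’s collateral. A fall in the asset price $q_t$ tightens the constraint and forces sales that depress $q_t$ further, which is the amplification and persistence mechanism the survey documents.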

Another criticism often levelled at DSGE models is that they do not incorporate a theory of unemployment. As an illustration, the term “unemployment” cannot even be found in the index of Michael Woodford’s and Carl Walsh’s books – two classics of New Keynesian macroeconomics. Jordi Gali has taken up this task in a new book that aims at incorporating unemployment into a fully-fledged DSGE framework. The book reformulates the sticky-price, sticky-wage model of Erceg, Henderson and Levin (2000), but assumes that the margin of adjustment in the labor market is the extensive margin (employment vs. unemployment) rather than the intensive margin (hours worked per employee). In this model, unemployment fluctuations are reflected in fluctuations of the wage mark-up around its desired level. Based on this framework, Gali derives a measure of the output gap from observable variables and argues that an interest-rate rule targeting the unemployment rate and inflation outperforms other policy rules. Jordi Galí, Frank Smets and Rafael Wouters then incorporate this reformulated labor-market block into a medium-scale version – read: a version close to the ones used in central banks – of the workhorse New Keynesian model.
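The relation at the heart of this reformulation can be sketched as follows (the log-linear notation is ours; this is a simplified rendering of the mechanism, not Gali’s full derivation). Households participate in the labor force up to the point where the real wage equals the marginal rate of substitution of the marginal participant, which defines the labor force $l_t$; comparing this with the marginal rate of substitution of those actually employed, $n_t$, ties the wage markup to the unemployment rate:

```latex
\begin{align*}
w_t - p_t &= c_t + \varphi\, l_t \\
\mu^{w}_t &= (w_t - p_t) - (c_t + \varphi\, n_t) = \varphi\,(l_t - n_t) = \varphi\, u_t
\end{align*}
```

Since the unemployment rate $u_t \equiv l_t - n_t$ is proportional to the wage markup, observable unemployment can be used to infer the markup and, from there, the output gap.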

Marco Del Negro, Gauti Eggertsson, Andrea Ferrero and Nobuhiro Kiyotaki present what a post-crisis DSGE model is now likely to look like. The model incorporates financial frictions and a liquidity trap in a DSGE framework with nominal and real rigidities, which allows the authors to ask the following questions: Can a shock to the liquidity of private paper lead to a collapse in short-term nominal interest rates and a recession like the one associated with the 2008 U.S. financial crisis? Once the nominal interest rate reaches the zero bound, what are the effects of interventions in which the government exchanges liquid government assets for illiquid private paper? Using this framework, the authors argue that the effects of liquidity interventions can be large, and show some numerical examples in which such interventions actually prevented a repeat of the Great Depression in 2008-2009.

For those interested in learning the basics of DSGE models, David Romer has a great chapter in the new edition of his graduate textbook. The treatment is rigorous but easier than in Gali (2008), Woodford (2003) or Walsh (2010), and can therefore serve as a useful starting point in this literature. The bulk of the chapter extends the analysis of the microeconomic foundations of incomplete nominal flexibility to dynamic settings. It then presents an example of a complete DSGE model with nominal rigidity: the canonical three-equation New Keynesian model of Clarida, Galí, and Gertler (2000). As Romer notes, in many ways this model is closer to the baseline real-business-cycle model than to the ultimate objective: much of its appeal lies in tractability and elegance, not realism. The last section discusses elements of other DSGE models with monetary non-neutrality; because of these models’ complexity and the lack of agreement about their key ingredients, however, it stops short of analyzing other fully specified models.
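For reference, the canonical three-equation model referred to above is usually written as follows (notation varies slightly across textbooks): a forward-looking IS curve, a New Keynesian Phillips curve, and an interest-rate rule.

```latex
\begin{align*}
\tilde{y}_t &= E_t \tilde{y}_{t+1} - \tfrac{1}{\sigma}\big(i_t - E_t \pi_{t+1} - r^{n}_t\big) && \text{(IS curve)} \\
\pi_t &= \beta\, E_t \pi_{t+1} + \kappa\, \tilde{y}_t && \text{(Phillips curve)} \\
i_t &= \phi_{\pi}\, \pi_t + \phi_{y}\, \tilde{y}_t + v_t && \text{(interest-rate rule)}
\end{align*}
```

Here $\tilde{y}_t$ is the output gap, $r^{n}_t$ the natural real rate of interest, and $v_t$ a monetary policy shock.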

Calibration and Estimation

Paul Krugman thinks that calibration is an acceptable practice, as long as you keep your perspective. In one of his home fields, international trade, there is fairly widespread use of “computable general equilibrium” models — fully internally consistent models of how prices and quantities fit together, with some parameters that are guesstimates based on the literature, and others that are tweaked so as to match actual trade flows in some base period. These models are then used for “what-if” analyses, especially of the effects of possible changes in trade policy. The models clearly aren’t literally true, and in no sense are you testing your theory. What you’re basically doing is running elaborate thought experiments that are somewhat disciplined by the data, and which you hope are more informative than just plain guesses.

The point is that if you have a conceptual model of some aspect of the world, which you know is at best an approximation, it’s OK to see what that model would say if you tried to make it numerically realistic in some dimensions. It is also true that a calibration exercise is informative when it fails: if there is no way to squeeze the relevant data into your model, or if the calibrated model makes predictions that you know on other grounds are ludicrous, something has been gained.
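As a minimal illustration of the kind of exercise Krugman describes, the toy sketch below calibrates a two-good CES demand system so that it reproduces observed base-period expenditure shares, takes the substitution elasticity as a guesstimate “from the literature”, and then runs a what-if tariff experiment. Everything here (the model, the names, the numbers) is illustrative and not drawn from any actual CGE study.

```python
import numpy as np

# Toy CES demand system in the spirit of a CGE calibration exercise.
# One parameter (sigma) is a guesstimate from the literature; the share
# parameters are tweaked so the model reproduces base-period data exactly;
# the calibrated model is then used for a what-if policy experiment.
# All values are illustrative.

sigma = 4.0                          # elasticity of substitution (guesstimate)
base_prices = np.array([1.0, 1.0])   # domestic, imported (normalized to 1)
base_shares = np.array([0.7, 0.3])   # observed expenditure shares, base period
income = 100.0

# Calibration: with base prices normalized to 1, the CES share parameters
# equal the observed expenditure shares.
a = base_shares.copy()

def shares(prices):
    """Model-implied expenditure shares at the given prices."""
    price_index = (a @ prices ** (1 - sigma)) ** (1 / (1 - sigma))
    return a * (prices / price_index) ** (1 - sigma)

def quantities(prices):
    """Model-implied quantities demanded at the given prices."""
    return shares(prices) * income / prices

# The calibrated model matches the data in the base period by construction.
assert np.allclose(shares(base_prices), base_shares)

# What-if experiment: a 10% tariff raises the price of the imported good.
new_prices = np.array([1.0, 1.1])
print("baseline quantities:   ", quantities(base_prices))
print("post-tariff quantities:", quantities(new_prices))
```

Note that nothing in this exercise “tests” the model: the calibration guarantees a perfect fit in the base period, and the what-if answers are only as informative as the assumed structure and elasticity.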

Stephen Williamson writes that part of what the calibration people were reacting to was a view among econometricians that quantitative work was about “testing” theories. The problem is that any macroeconomic model is going to be wrong on some dimensions: to be useful, a model must be simple, and simplification makes it wrong in some sense. Subjected to standard econometric tests, it will be rejected. Yet models that are rejected by the data can nevertheless be extremely useful. He thinks that point is now widely recognized, and you won’t find strong objections to it, as you might have in 1982.

For more on this issue, we recommend this video by Chris Sims, co-recipient of the 2011 Nobel Prize in Economics, on calibration, statistical inference and structural change, given recently at the INET conference. Mark Thoma has an easy one-page introduction to VARs for those in need of catching up who never really understood how these things work.
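For readers who prefer to see the mechanics, a VAR is just a set of regressions of each variable on lags of all the variables in the system. The sketch below is our own toy example (simulated data, a VAR with one lag, estimated by equation-by-equation least squares); it shows the mechanics only, not how applied VAR work (lag selection, identification, impulse responses) is actually done.

```python
import numpy as np

# Simulate a two-variable VAR(1) and recover its lag matrix by OLS.
rng = np.random.default_rng(0)
T = 200
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])       # true lag coefficients

y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Each equation regresses today's value on a constant and yesterday's values.
X = np.column_stack([np.ones(T - 1), y[:-1]])
Y = y[1:]
coefs, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS, one column per equation

print("estimated lag matrix:\n", coefs[1:].T)   # should be close to A_true
```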

*Bruegel Economic Blogs Review is an information service that surveys external blogs. It does not survey Bruegel’s own publications, nor does it include comments by Bruegel authors.

