
The puzzle of technical dis-employment and productivity slowdown

What’s at stake: Larry Summers made an important speech a few weeks ago at a Peterson Institute conference on the productivity slowdown, arguing it is hard to square the view that new technology is producing substantial dis-employment with the view that productivity growth is slowing.

Publishing date
07 December 2015

A paradox

Larry Summers writes that it’s really hard to square the view that the “new economy is producing substantial dis-employment” with the view that “productivity growth is slowing”. The fact that you move through an airport with much less contact with ticket takers, the fact that you can carry out all kinds of transactions with your cellphone, the fact that you can check out of an increasing number of stores without human contact, and the fact that robots are increasingly present in manufacturing make a fairly compelling case that the increasing dis-employment we see is related to technical change. And yet it is hard to see how technical change could be a major source of dis-employment without also being a major source of productivity improvement.

Adam Posen writes that it is a real puzzle to observe simultaneously multi-year trends of rising non-employment of low-skilled workers and declining measured productivity growth. Either we need a new understanding, or one of these observed patterns is ill-founded or misleading. In my view, we should trust labor market data more than GDP data when the two conflict: workers are employed and paid, and (usually) pay taxes, so they are counted directly, whereas much of GDP data is constructed. Thus productivity, as the residual of GDP growth after netting out capital and labor accumulation, is much less reliable than the directly observed count of workers.
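
Posen’s point about the residual can be made explicit. In standard growth-accounting notation (a textbook formulation, not Posen’s own), total factor productivity growth is whatever is left of output growth after netting out input growth:

$$\Delta \ln A \;=\; \Delta \ln Y \;-\; \alpha\,\Delta \ln K \;-\; (1-\alpha)\,\Delta \ln L$$

where $Y$ is GDP, $K$ is capital, $L$ is labor, and $\alpha$ is capital’s share of income. Because $L$ is counted directly from employment records while $Y$ is constructed, any error in measuring $Y$ passes one-for-one into the residual $A$.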

The accelerating mismeasurement hypothesis

Larry Summers writes that it is at least possible that there are substantial mismeasurement aspects and that there is a reasonable prospect of accelerating mismeasurement as an explanation for some part of this puzzle. As we move from tangible manufactured goods to intangible services, it seems to me plausible that the fraction of the economy where we’re doing really badly on measuring quality is likely to be increasing, and that means that mismeasurement is increasing.

Larry Summers writes that it’s almost impossible to disagree with the view that the price indices overstate inflation and therefore that the quantity indices understate real growth. It is significantly more contestable whether that process has accelerated or not, but I don’t find myself with an alternative way of thinking about dis-employment through technology, which seems to be a pervasive phenomenon, and that leads me to assign more weight to the mismeasurement hypothesis than I otherwise would. The other explanations would be either that there really isn’t a set of dis-employment events, or that there is a production function that can somehow generate more dis-employment together with slower productivity growth.
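
The first part of Summers’ argument is a simple identity (standard notation, not his own): real growth is nominal growth deflated by measured inflation,

$$g_{\text{real}} \;=\; g_{\text{nominal}} \;-\; \pi_{\text{measured}}.$$

If the price index overstates true inflation by a bias $b$, so that $\pi_{\text{measured}} = \pi_{\text{true}} + b$, then measured real growth understates true real growth by exactly $b$. Note that a constant bias cannot explain a slowdown; the mismeasurement hypothesis requires $b$ to have grown over time, which is what “accelerating mismeasurement” means here.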

Jan Hatzius and Kris Dawsey write that increased measurement error might indeed account for most of the ¾pp decline seen in consensus estimates of trend productivity growth over the past decade. The shift in the IT sector’s center of gravity from general-purpose IT hardware to specialized IT hardware, software and digital content—where it is far harder to measure quality-adjusted prices and real output—may have resulted in a growing statistical understatement of the technology contribution to growth.

  • The sharp slowdown in the measured deflation of semiconductors and computers may be a spurious consequence of shifts in industry dynamics rather than a genuine slowdown in technological progress. The lack of any significant quality-adjusted price decline in more specialized IT hardware industries also looks implausible. Our best guess is that these two issues together are worth about 0.2pp per year on real GDP growth.
  • The problems in the software and digital content industry are thornier but potentially more sizable. One is that the official statistics seem to make little attempt to adjust for quality improvements in these products over time. Another is that the proliferation of free digital content has arguably introduced a more extreme form of “new product bias” into the inflation statistics. These two issues might be worth another 0.5pp per year (see the tally below).
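
Summing the two guesses (simple arithmetic implied by their figures, not a number quoted from the note):

$$0.2\,\text{pp} \;+\; 0.5\,\text{pp} \;=\; 0.7\,\text{pp} \;\approx\; \tfrac{3}{4}\,\text{pp per year},$$

which is roughly the size of the decline in consensus estimates of trend productivity growth that they set out to explain.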


John Fernald writes that David Byrne at the Board of Governors, along with various co-authors, has done a lot of work on that and suggests that, indeed, this is important and probably adds up to one- or two-tenths of a percentage point on GDP growth. So that, on its own, will not change the productivity numbers. Remember, the magnitude of what we need to explain is a slowdown after 2003 or 2004 of two percentage points on labor productivity growth and maybe 1.25 to 1.5 percentage points on TFP. So one- or two-tenths won’t close that gap.
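
Putting Fernald’s magnitudes side by side (again, simple arithmetic on the figures he cites) makes the point: an adjustment of 0.1 to 0.2pp against a slowdown of 2pp,

$$\frac{0.2\,\text{pp}}{2\,\text{pp}} \;=\; 10\%,$$

so even at the upper end mismeasurement of this kind accounts for only about a tenth of the measured slowdown in labor productivity growth.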

About the authors

  • Jérémie Cohen-Setton

    Jérémie Cohen-Setton is a Research Fellow at the Peterson Institute for International Economics. Jérémie received his PhD in Economics from U.C. Berkeley and worked previously with Goldman Sachs Global Economic Research, HM Treasury, and Bruegel. At Bruegel, he was Research Assistant to Director Jean Pisani-Ferry and President Mario Monti. He also shaped and developed the Bruegel Economic Blogs Review.
