Blogs review: Big Data, aggregates and individuals

Date: March 26, 2013 | Topics: Energy & Climate, European Macroeconomics & Governance, Innovation & Competition Policy

What’s at stake: The Big Data enthusiasts compare it to a revolution. For the agnostic observer, it is interesting to move beyond this general and somewhat speculative discussion and get a sense of what these massive quantities of information produced by and about people, things, and their interactions can and cannot do. In a previous review, we discussed recent applications of Big Data in the field of social science research. In this review, we consider whether and how Big Data can help complement and refine official statistics and offer ways to develop alternative measures of welfare.

Source: Manu

Big Data: revolution or evolution?

The Big Data enthusiasts call it a revolution. In 2008, Joseph Hellerstein called it the “industrial revolution of data”. Erik Brynjolfsson and Andrew McAfee consider that the ability to track massive amounts of real-time data to improve everything from websites to delivery routes is the innovation story of our time. Alex (Sandy) Pentland writes that we can now use the “little data breadcrumbs” that individuals leave behind to move beyond averages and aggregates and learn more about individual behavior and networks. Ben Bernanke made a similar point in a recent speech (HT Miles Kimball), arguing that exclusive attention to aggregate numbers is likely to paint an incomplete picture of what many individuals are experiencing.

Gary King notes that Big Data is not about the data itself but about data analytics and paradigmatic shifts. Patrick Wolfe compares current attempts to build models of human behaviour on the basis of Big Data to the standardization of physical goods in the Industrial Revolution. King adds that the data deluge does not make the scientific method obsolete, as Chris Anderson claimed in 2008, but it does shift the balance between theory and empirics heavily toward empirics.

Danah Boyd warns, however, that Big Data heightens the risk of apophenia – seeing meaningful patterns and connections where none exist. Kate Crawford cautions that relying on Big Data alone, without recognizing its hidden biases, may paint a distorted picture, and that more data does nothing to dissolve the fundamental difference between correlation and causation. Danah Boyd and Kate Crawford, as well as Martin Hilbert, make an interesting point about the emergence of a new digital divide along data-analytics capacities.
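To see why apophenia is a statistical inevitability at scale rather than mere sloppiness, consider a minimal sketch (ours, not from the posts reviewed): among enough unrelated series, some pair will look strongly related by chance alone.

```python
# A minimal sketch of the apophenia problem: 200 independent random walks,
# none of which has any real relationship to another.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_obs = 200, 50

walks = rng.standard_normal((n_series, n_obs)).cumsum(axis=1)

corr = np.corrcoef(walks)          # 200 x 200 matrix of pairwise correlations
np.fill_diagonal(corr, 0.0)        # ignore trivial self-correlations
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"strongest 'pattern': series {i} vs series {j}, r = {corr[i, j]:.2f}")
# With 200 series there are roughly 20,000 pairs, so |r| well above 0.9 is
# routine: a meaningful-looking pattern where none exists.
```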

Big Data and official statistics

Big Data bears directly on the production, relevance and reliability of key official statistics such as GDP and inflation.

Michael Horrigan, the head of the Bureau of Labor Statistics (BLS) Office of Prices and Living Conditions, provides a definition that helps clarify where the border lies between Big Data and traditional data: Big Data are non-sampled data, characterized by the creation of databases from electronic sources whose primary purpose is something other than statistical inference.

Source: Michael Horrigan

Big Data are already increasingly used to produce or complement official statistics in several advanced economies. Michael Horrigan describes how the BLS already relies heavily on a large array of non-official sources, including corporate data, to refine official economic statistics. These innovations draw on insights and web-scraping techniques from the Billion Prices Project to track inflation in real time, the ‘nowcasting’ techniques developed by Hal Varian on the basis of Google searches, and research by Matthew Shapiro that uses data from Twitter accounts in a model to predict the level of initial claims for unemployment insurance. The Financial Times also reports how Maurizio Luisi and Alessandro Beber are using Big Data techniques—news mining and sentiment analysis—to build a “real-time growth index” that assesses the relative strength of economic news in context and gives a snapshot of growth across leading economies.
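To illustrate how such a nowcast works mechanically, here is a minimal sketch in the spirit of Varian's Google Trends work, using simulated placeholder data rather than any of the actual series: an autoregressive baseline for an official statistic is augmented with a same-period search-intensity index that is observable in real time.

```python
# Sketch of a nowcasting regression: AR(1) baseline vs. AR(1) + search index.
# All data are simulated stand-ins; in practice y might be initial
# unemployment claims and x a Google Trends index for a related query.
import numpy as np

rng = np.random.default_rng(1)
T = 120

x = rng.standard_normal(T)                 # stand-in search-intensity index
y = np.zeros(T)                            # official series it partly tracks
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + 0.5 * x[t] + 0.3 * rng.standard_normal()

# Baseline: y_t on y_{t-1}.  Nowcast: add x_t, which arrives weeks before
# the official release.
X_base = np.column_stack([np.ones(T - 1), y[:-1]])
X_now = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])

def r_squared(X, target):
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return 1.0 - resid.var() / target.var()

print(f"R^2, AR(1) baseline   : {r_squared(X_base, y[1:]):.2f}")
print(f"R^2, with search index: {r_squared(X_now, y[1:]):.2f}")
# The R^2 gain from x measures the value of the real-time signal.
```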

The need for more timely and sensitive measures of economic activity in an increasingly volatile economic environment was made especially clear during the Great Recession. A recent IMF paper (Quarterly GDP Revisions in G-20 Countries: Evidence from the 2008 Financial Crisis) discussed by Menzie Chinn documents how the systems used to compile quarterly GDP in most advanced countries are well designed to track an economy on a steady growth path, but less suited to fully picking up large swings in economic activity in real time. Early estimates of quarterly GDP become less reliable as economic volatility increases, typically requiring ex post downward revisions during recessions and upward revisions during expansions. Brent Moulton and Dennis Fixler of the BEA note that the total revision of 2008Q4 US GDP from the advance to the latest estimate is the largest downward GDP revision on record.
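The mechanics are simple: a revision is the latest estimate minus the advance estimate, and the asymmetry appears when revisions are averaged separately over recessions and expansions. A toy sketch with illustrative numbers (loosely patterned on the episode, not actual BEA data):

```python
# Toy illustration of the revision pattern the IMF paper documents.
import numpy as np

# (advance estimate, latest estimate, recession quarter?) for six
# hypothetical quarters of annualized GDP growth, in percent.
quarters = [
    ( 2.1,  2.4, False), ( 1.8,  2.0, False), ( 2.5,  2.9, False),
    (-0.5, -1.9, True),  (-3.8, -8.4, True),  (-1.0, -4.4, True),
]

advance = np.array([q[0] for q in quarters])
latest = np.array([q[1] for q in quarters])
recession = np.array([q[2] for q in quarters])

revision = latest - advance                 # ex post revision per quarter
print(f"mean revision, expansions: {revision[~recession].mean():+.2f} pp")
print(f"mean revision, recessions: {revision[recession].mean():+.2f} pp")
# Small upward revisions in expansions, large downward ones in recessions:
# the 2008Q4 advance-to-latest revision is the extreme case Moulton and
# Fixler flag.
```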

Source: BEA

Marcelo Giugale writes that similar innovations are emerging in developing countries, where official statistics can be several years old. Pointing out that Africans should not have to wait a decade after a survey is fielded for their country’s GDP to be adjusted or its poverty numbers estimated, Wolfgang Fengler argues that poor countries can leapfrog in this process with estimates built on Big Data. This may help fix what World Bank Chief Economist for Africa Shanta Devarajan calls the continent’s “statistical tragedy”. In Indonesia, joint research by the UN Global Pulse team and Crimson Hexagon found that conversations about rice on Twitter could help monitor food prices with surprising accuracy.
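The monitoring logic behind the Indonesia result can be sketched in a few lines. This is an assumed, simplified workflow with simulated data, not Global Pulse's actual pipeline (which analysed classified tweet content rather than raw counts); it only shows why social-media volume can track a price series.

```python
# Sketch: does the daily volume of rice-price tweets track an official
# price series?  Both series are simulated here.
import numpy as np

rng = np.random.default_rng(2)
days = 180

# Simulated official rice price (a slow random walk) and a daily tweet
# count that partly reflects it: price spikes draw online complaints.
price = 100 + np.cumsum(rng.normal(0.0, 0.5, days))
tweets = np.maximum(
    0, 5 + 2.0 * (price - price.mean()) + rng.normal(0.0, 3.0, days)
).round()

r = np.corrcoef(price, tweets)[0, 1]
print(f"correlation between price level and tweet volume: {r:.2f}")
# A strong correlation would let the tweet count serve as a cheap,
# real-time proxy between (or ahead of) official price surveys.
```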

But the national and international statistical communities recognize the many challenges ahead, which were discussed at several recent events: at the World Bank, at the OECD in October and January, and during the 44th session of the United Nations Statistical Commission. A recent paper commissioned by the United Nations Economic Commission for Europe encourages National Statistical Offices to develop in-house Big Data analytical capability through specialized training. Reflecting these growing concerns among national statisticians, the 2013 International Statistical Institute’s World Statistics Congress, to be held in Hong Kong in August, will include a special session on Big Data and official statistics.

Michael Horrigan discusses the future use of Big Data by the U.S. statistical system and notes that integrating private data sources such as Google, Intuit and Billion Prices is unlikely without more transparency from these providers. Progress towards better comparability between Big Data results and official statistics seems more likely. Andrew Wyckoff reckons that the new roles National Statistical Offices may play in the Big Data age include acting as a trusted third-party certifier and as an issuer of best practices for data mining.

Big Data and alternative measures of welfare

Big Data may also offer ways to go beyond traditional economic statistics. Alternative measures of welfare have been around for decades, including the Human Development Index since the early 1990s and the Kingdom of Bhutan’s Gross National Happiness since 1972, as well as the more recent OECD Better Life Index and Happy Planet Index. Big Data has revived some of these debates by providing new data sources and analytical tools.

Ben Bernanke said in a recent speech that we should seek better and more direct measurements of economic well-being, the ultimate objective of our policy decisions. Miles Kimball notes that these concerns echo the position of British Prime Minister David Cameron, who said policymakers should focus not just on GDP but on general wellbeing, and of former French President Nicolas Sarkozy, who pledged to fight to make all international organizations change their statistical systems in line with the recommendations of the Stiglitz Commission on the Measurement of Economic Performance and Social Progress.

On the occasion of the recent United Nations International Day of Happiness, John Havens, founder of the H(app)athon Project, described the project’s main aim as giving “big data a direction”, drawing in part on the insights of the Quantified Self movement co-founded by Wired’s Kevin Kelly, which relies on self-monitoring and the collection and use of personal data. Esther Dyson predicts the emergence of a similar “Quantified Community” movement, with communities relying on their own data to measure and improve the state, health, and activities of their people and institutions.

Miles Kimball wonders how we can avoid the dangers of manipulation and politicization of new indicators of well-being. He suggests that if we are going to use objective and subjective measures of well-being, such as happiness and life satisfaction, alongside well-seasoned measures such as GDP to assess how well a nation is doing, we need to proceed in a careful, scientific way that can stand the test of time.

Data sharing and privacy

Since the bulk of Big Data is held by corporations, Robert Kirkpatrick and Patrick Meier have been promoting the concept of “data philanthropy” in the context of development and humanitarian response: companies sharing proprietary datasets for social good. Michael Horrigan argues that greater progress is likely to be made with big data from businesses than from households. Hal Varian lists incentives for firms to provide data:

  • Profit motive (Mastercard, Visa)
  • Brand identity, thought leadership (Intuit, Monster, Zillow, Google)
  • Financial reporting to investors (FedEx, UPS, retail)

The World Economic Forum proposes giving individuals greater control over how their data are used. The New York Times reports that Alex Pentland, an adviser to the WEF, proposes “a new deal on data” that would entail the rights to possess one’s data, to control how it is used, and to destroy or distribute it as one sees fit.

Felix Gillette writes in Bloomberg that at the moment, the default setting for almost everything people share online is that it will live for eternity in the cloud. In Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger, a lawyer and a professor at the University of Oxford, argues that this inevitably creates problems for individuals and societies that need the ability to forget in order to move forward. A perfect memory, he writes, can be paralyzing, trapping people in the past and discouraging them from trying new challenges.

Viktor Mayer-Schönberger argues that all information created online should come with customizable expiration dates. Not every piece of data would have to expire after a few seconds as photos on Snapchat do. The key, says Mayer-Schönberger, is to include some form of a self-destruct button in everything created online and to give consumers the power to tinker with the settings from the outset.
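As a concrete rendering of the proposal, here is a minimal sketch of an expiring data store in which every item carries a user-chosen self-destruct time. The class and its API are invented for illustration; nothing here comes from Mayer-Schönberger's book.

```python
# Sketch of 'customizable expiration dates' as a data structure: items
# silently disappear once their user-chosen lifetime lapses.
import time

class ExpiringStore:
    def __init__(self):
        self._items = {}                      # key -> (value, expires_at)

    def put(self, key, value, ttl_seconds):
        """Store a value with a self-destruct time chosen by the user."""
        self._items[key] = (value, time.time() + ttl_seconds)

    def get(self, key):
        """Return the value if it has not expired; forget it otherwise."""
        value, expires_at = self._items.get(key, (None, 0.0))
        if time.time() >= expires_at:
            self._items.pop(key, None)        # the 'forgetting' step
            return None
        return value

store = ExpiringStore()
store.put("photo", "beach.jpg", ttl_seconds=2)   # expires in 2 seconds
print(store.get("photo"))   # -> beach.jpg
time.sleep(2.5)
print(store.get("photo"))   # -> None: the data has forgotten itself
```

The design choice mirrors the argument in Delete: forgetting is the default behaviour of the store, and remembering longer is what the user must explicitly configure.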

