
Blogs review: Big Data, aggregates and individuals


Date: March 26, 2013

What’s at stake: The Big Data enthusiasts compare it to a revolution. For the agnostic observer, it is interesting to move beyond this general and somewhat speculative discussion and get a sense of what these massive quantities of information produced by and about people, things, and their interactions can and cannot do. In a previous review, we discussed recent applications of Big Data in the field of social science research. In this review, we consider whether and how Big Data can help complement and refine official statistics and offer ways to develop alternative measures of welfare.


Big Data: revolution or evolution?

The Big Data enthusiasts call it a revolution. In 2008, Joseph Hellerstein called it the “industrial revolution of data”. Erik Brynjolfsson and Andrew McAfee consider that the ability to track massive amounts of real-time data to improve everything from websites to delivery routes is the innovation story of our time. Alex (Sandy) Pentland writes that we can now use the “little data breadcrumbs” that individuals leave behind to move beyond averages and aggregates and learn more about individual behavior and networks. Ben Bernanke made a similar point in a recent speech (HT Miles Kimball), noting that exclusive attention to aggregate numbers is likely to paint an incomplete picture of what many individuals are experiencing.

Gary King notes that Big Data is not about the data but about data analytics and paradigmatic shifts. Patrick Wolfe compares current attempts to build models of human behaviour on the basis of Big Data to the standardization of physical goods in the Industrial Revolution. King adds that the data deluge does not make the scientific method obsolete, as Chris Anderson wrote in 2008, but that it does shift the balance between theory and empirics toward empirics in a big way.

Danah Boyd warns, however, that Big Data heightens the risk of apophenia – seeing meaningful patterns and connections where none exist. Kate Crawford cautions that relying on Big Data alone, without recognizing its hidden biases, may paint a distorted picture, and that more data does nothing to bridge the fundamental gap between correlation and causation. Danah Boyd and Kate Crawford, as well as Martin Hilbert, make an interesting point about the emergence of a new digital divide along data-analytics capacities.

Big Data and official statistics

Big Data matters for the production, relevance and reliability of key official statistics such as GDP and inflation.

Michael Horrigan, the head of the Bureau of Labor Statistics (BLS) Office of Prices and Living Conditions, provides a definition that helps clarify where the border lies between Big Data and traditional data: Big Data are non-sampled data, characterized by the creation of databases from electronic sources whose primary purpose is something other than statistical inference.

Big Data are already increasingly used to produce or complement official statistics in several advanced economies. Michael Horrigan describes how the BLS already relies heavily on a large array of non-official data, including corporate data, to refine official economic statistics. These innovations draw on insights and web-scraping techniques from the Billion Prices Project to track inflation in real time, on the ‘nowcasting’ techniques developed by Hal Varian on the basis of Google searches, and on research by Matthew Shapiro that uses data from Twitter accounts in a model to predict the level of initial claims for unemployment insurance. The Financial Times also reports how Maurizio Luisi and Alessandro Beber are using Big Data techniques (news mining and sentiment analysis) to build a “real-time growth index” that assesses the relative strength of economic news in context and gives a snapshot of growth across leading economies.
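To give a concrete sense of how such nowcasting exercises work, here is a minimal sketch in the spirit of Varian’s approach: an autoregressive model of initial unemployment claims augmented with a search-volume index that is observable before the official release. The data are synthetic stand-ins and the variable names are illustrative; this is not the actual BLS or Google Trends series, nor Varian’s exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: weekly initial claims and a search-volume index
# (e.g. queries for "unemployment benefits"). Real work would pull the
# official claims series and Google Trends data instead.
T = 200
search = rng.normal(size=T)
claims = np.empty(T)
claims[0] = 350.0
for t in range(1, T):
    # Claims follow an AR(1) process that also responds to search activity.
    claims[t] = 50 + 0.85 * claims[t - 1] + 8.0 * search[t] + rng.normal(scale=5.0)

# Baseline: AR(1) only. Augmented: AR(1) plus the contemporaneous search
# index, which is available in real time, before the official number.
y = claims[1:]
X_base = np.column_stack([np.ones(T - 1), claims[:-1]])
X_aug = np.column_stack([np.ones(T - 1), claims[:-1], search[1:]])

for name, X in [("AR(1) baseline", X_base), ("AR(1) + search index", X_aug)]:
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rmse = np.sqrt(np.mean((y - X @ beta) ** 2))
    print(f"{name}: in-sample RMSE = {rmse:.2f}")
```

The point of the comparison is that a real-time proxy like search volume can shrink the forecast error of the purely autoregressive baseline, which is the logic behind using such series to anticipate official releases.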

The need for more timely and sensitive measures of economic activity in an increasingly volatile environment became especially clear during the Great Recession. A recent IMF paper (Quarterly GDP Revisions in G-20 Countries: Evidence from the 2008 Financial Crisis) discussed by Menzie Chinn documents how the systems used to compile quarterly GDP in most advanced countries are well designed to track an economy on a steady growth path, but less suited to fully picking up large swings in economic activity in real time. Early estimates of quarterly GDP become less reliable as economic volatility increases, typically requiring downward revisions during recessions and upward revisions during expansions. Brent Moulton and Dennis Fixler of the BEA note that the total revision of 2008Q4 US GDP from the advance to the latest estimate is the largest downward GDP revision on record.

Source: BEA
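The mechanics behind such revision studies are straightforward once two vintages of the same series are aligned: the revision is simply the latest estimate minus the advance estimate, summarized separately for recession and expansion quarters. The sketch below runs this computation on made-up numbers, not the actual BEA vintages.

```python
import numpy as np

# Made-up advance and latest quarterly GDP growth estimates (percent,
# annualized) spanning an expansion followed by a sharp recession.
advance = np.array([2.1, 2.4, 1.9, 2.2, 0.5, -0.5, -3.8, -4.0])
latest = np.array([2.6, 2.9, 2.3, 2.5, 0.1, -2.7, -8.4, -4.4])

revision = latest - advance
recession = latest < 0  # crude regime indicator for the illustration

print(f"Mean revision in expansions: {revision[~recession].mean():+.2f} pp")
print(f"Mean revision in recessions: {revision[recession].mean():+.2f} pp")
```

On the pattern the IMF paper documents, the first number would come out positive (upward revisions in expansions) and the second negative (downward revisions in recessions).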

Marcelo Giugale writes that similar innovations can be observed in developing countries, where official statistics can be several years old. Pointing out that Africans should not have to wait a decade after a survey is fielded for their country’s GDP to be adjusted or poverty numbers estimated, Wolfgang Fengler argues that poor countries can leapfrog in this process with estimates built on Big Data. This may help fix what World Bank Chief Economist for Africa Shanta Devarajan calls the continent’s “statistical tragedy”. In Indonesia, joint research by the UN Global Pulse team and Crimson Hexagon found that conversations about rice on Twitter could help monitor food prices with surprising accuracy.
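As a rough illustration of the mechanics behind the Global Pulse exercise, the sketch below filters tweets with a keyword rule and correlates daily keyword volume with an official price series. Both series are synthetic placeholders, and the keyword list is an illustrative assumption, not Global Pulse’s actual methodology.

```python
import re
import numpy as np

rng = np.random.default_rng(1)

# Synthetic placeholder data: 90 days of an official rice price index and
# daily tweet volumes that loosely track it. In practice the tweets would
# come from the Twitter API and the prices from official statistics.
days = 90
price = 100 + np.cumsum(rng.normal(scale=0.5, size=days))
keyword_mentions = np.maximum(
    0, (price - price.min()) * 3 + rng.normal(scale=5, size=days)
).round()

# A toy keyword filter of the kind such pipelines start from
# (Indonesian and English phrases about rice prices).
PRICE_PATTERN = re.compile(r"\b(harga beras|rice price|beras naik)\b", re.IGNORECASE)

def mentions_rice_price(tweet: str) -> bool:
    """Return True if a tweet matches the (illustrative) keyword list."""
    return bool(PRICE_PATTERN.search(tweet))

assert mentions_rice_price("Harga beras naik lagi minggu ini!")

# Correlate daily keyword volume with the official price series.
corr = np.corrcoef(keyword_mentions, price)[0, 1]
print(f"Correlation between keyword volume and official prices: {corr:.2f}")
```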

But the national and international statistical communities recognize the many challenges ahead, discussed at several recent events at the World Bank, at the OECD in October and January, and most recently at the 44th session of the United Nations Statistical Commission. A recent paper commissioned by the United Nations Economic Commission for Europe encourages National Statistical Offices to develop in-house Big Data analytical capability through specialized training. Reflecting these growing concerns among national statisticians, the 2013 International Statistical Institute World Statistics Congress, to be held in Hong Kong in August, will include a special session on Big Data and official statistics.

Michael Horrigan discusses the future use of Big Data by the U.S. statistical system and notes that integrating private sources of data such as Google, Intuit and the Billion Prices Project is unlikely without more transparency from these providers. Progress towards better comparability between Big Data results and official statistics seems more likely. Andrew Wyckoff reckons that the new roles National Statistical Offices may play in the Big Data age include acting as a trusted third-party certifier and issuer of best practices for data mining.

Big Data and alternative measures of welfare

Big Data may also offer ways to go beyond traditional economic statistics. Alternative measures of welfare have been around for decades, from the Kingdom of Bhutan’s Gross National Happiness, in use since 1972, and the Human Development Index, published since the early 1990s, to the more recent OECD Better Life Index and Happy Planet Index. Big Data has revived some of these debates by providing new data sources and analytics tools.

Ben Bernanke said in a recent speech that we should seek better and more direct measurements of economic well-being, the ultimate objective of our policy decisions. Miles Kimball notes that these concerns echo the positions of British Prime Minister David Cameron, who said policymakers should focus not just on GDP but on general wellbeing, and of former French President Nicolas Sarkozy, who pledged to fight to make all international organizations change their statistical systems by following the recommendations of the Stiglitz Commission on the Measurement of Economic Performance and Social Progress.

On the occasion of the recent United Nations International Day of Happiness, John Havens, founder of the H(app)athon Project, described the project’s main aim as giving “big data a direction”, drawing in part on the insights of the Quantified Self movement, co-founded by Wired’s founding executive editor Kevin Kelly, which relies on self-monitoring and the collection and use of personal data. Esther Dyson predicts the emergence of a similar “Quantified Community” movement, with communities relying on their own data to measure and improve the state, health, and activities of their people and institutions.

Miles Kimball wonders how we can avoid the dangers of manipulation and politicization of new indicators of well-being. If we are going to use objective and subjective measures of well-being, such as happiness and life satisfaction, alongside well-seasoned measures such as GDP to assess how well a nation is doing, he suggests, we need to proceed in a careful, scientific way that can stand the test of time.

Data sharing and privacy

Since the bulk of Big Data is held by corporations, Robert Kirkpatrick and Patrick Meier have been promoting the concept of “data philanthropy” within the context of development and humanitarian response. Data philanthropy involves companies sharing proprietary datasets for social good. Michael Horrigan argues that greater progress is likely to be made using Big Data from businesses than from households. Hal Varian discusses firms’ incentives to provide data:

  • Profit motive (Mastercard, Visa)
  • Brand identity, thought leadership (Intuit, Monster, Zillow, Google)
  • Financial reporting to investors (FedEx, UPS, retail)

The World Economic Forum proposes giving individuals greater control over data usage. The New York Times reports that Alex Pentland, an adviser to the WEF, proposes “a new deal on data” that would entail the rights to possess one’s data, to control how it is used, and to destroy or distribute it as one sees fit.

Felix Gillette writes in Bloomberg that at the moment, the default setting for almost everything people share online is that it will live for eternity in the cloud. In Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger, a lawyer and a professor at the University of Oxford, argues that this inevitably creates problems for individuals and societies that need the ability to forget in order to move forward. A perfect memory, he writes, can be paralyzing, trapping people in the past and discouraging them from trying new challenges.

Viktor Mayer-Schönberger argues that all information created online should come with customizable expiration dates. Not every piece of data would have to expire after a few seconds, as photos on Snapchat do. The key, says Mayer-Schönberger, is to include some form of self-destruct button in everything created online and to give consumers the power to tinker with the settings from the outset.
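The proposal is easy to picture in code: every piece of data carries a user-set expiration date, and the storage system honours it. Below is a minimal sketch of that idea; the class and method names are illustrative inventions, not any existing API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ExpiringItem:
    """A piece of user data with a customizable expiration date."""
    content: str
    ttl_seconds: float  # set by the user at creation time
    created_at: float = field(default_factory=time.time)

    def is_expired(self, now: float | None = None) -> bool:
        now = time.time() if now is None else now
        return now - self.created_at > self.ttl_seconds

class ExpiringStore:
    """A store that honours per-item expiration, dropping data past its date."""

    def __init__(self) -> None:
        self._items: dict[str, ExpiringItem] = {}

    def put(self, key: str, content: str, ttl_seconds: float) -> None:
        self._items[key] = ExpiringItem(content, ttl_seconds)

    def get(self, key: str) -> str | None:
        item = self._items.get(key)
        if item is None or item.is_expired():
            self._items.pop(key, None)  # lazily purge expired data
            return None
        return item.content

store = ExpiringStore()
store.put("photo", "beach.jpg", ttl_seconds=10)          # expires quickly
store.put("essay", "draft.txt", ttl_seconds=86400 * 365)  # kept for a year
print(store.get("photo"))  # "beach.jpg" while still fresh
```

The design choice that matters here is that expiration is a property of the data itself, chosen by its creator, rather than a policy imposed by the platform, which is the core of Mayer-Schönberger’s argument.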


