Blogs review: Big Data, aggregates and individuals

By: Jérémie Cohen-Setton Date: March 26, 2013

What’s at stake: The Big Data enthusiasts compare it to a revolution. For the agnostic observer, it is interesting to move beyond this general and somewhat speculative discussion and get a sense of what these massive quantities of information produced by and about people, things, and their interactions can and cannot do. In a previous review, we discussed recent applications of Big Data in the field of social science research. In this review, we consider whether and how Big Data can help complement and refine official statistics and offer ways to develop alternative measures of welfare.

Source: Manu

Big Data: revolution or evolution?

The Big Data enthusiasts call it a revolution. In 2008, Joseph Hellerstein called it the “industrial revolution of data”. Erik Brynjolfsson and Andrew McAfee consider that the ability to track massive amounts of real-time data to improve everything from websites to delivery routes is the innovation story of our time. Alex (Sandy) Pentland writes that we can now use these “little data breadcrumbs” that individuals leave behind to move beyond averages and aggregates and learn more about individual behavior and networks. Ben Bernanke made a similar point in a recent speech (HT Miles Kimball), arguing that exclusive attention to aggregate numbers is likely to paint an incomplete picture of what many individuals are experiencing.

Gary King notes that Big Data is not about the data but about data analytics and paradigmatic shifts. Patrick Wolfe compares current attempts to build models of human behavior on the basis of Big Data to the standardization of physical goods in the Industrial Revolution. King argues that the data deluge does not make the scientific method obsolete, as Chris Anderson claimed in 2008, but that it shifts the balance between theory and empirics toward empirics in a big way.

Danah Boyd warns, however, that Big Data enhances the risk of apophenia – seeing meaningful patterns and connections where none exist. Kate Crawford cautions that relying on Big Data alone, without recognizing its hidden biases, may produce a distorted picture, and that more data does nothing to change the fundamental difference between correlation and causation. Danah Boyd and Kate Crawford, as well as Martin Hilbert, make an interesting point about the emergence of a new digital divide along data analytics capacities.

Big Data and official statistics

Big Data matters for the production, relevance and reliability of key official statistics such as GDP and inflation.

Michael Horrigan, the head of the Bureau of Labor Statistics (BLS) Office of Prices and Living Conditions, provides a definition that helps clarify where the border lies between Big Data and traditional data: Big Data are non-sampled data that are characterized by the creation of databases from electronic sources whose primary purpose is something other than statistical inference.

Source: Michael Horrigan

Big Data is already increasingly used to produce or complement official statistics in several advanced economies. Michael Horrigan describes how the BLS already relies heavily on a large array of non-official data, including corporate data, to refine official economic statistics. These innovations draw on insights and web-scraping techniques from the Billion Prices Project to track inflation in real time, on the ‘nowcasting’ techniques developed by Hal Varian on the basis of Google searches, and on research by Matthew Shapiro that uses data from Twitter accounts in a model to predict the level of initial claims for unemployment insurance. The Financial Times also reports how Maurizio Luisi and Alessandro Beber are using Big Data techniques – news mining and sentiment analysis – to build a “real-time growth index” that assesses the relative strength of economic news in context and gives a snapshot of growth across leading economies.
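To make the mechanics concrete, here is a minimal sketch of the kind of bridge regression behind Varian’s nowcasting approach: regress the target series on its own lag plus a contemporaneous search-volume index. The data below are synthetic placeholders, not actual BLS or Google Trends series, and the specification is illustrative rather than the published one.

```python
import numpy as np

# Synthetic stand-ins: in practice these would be, e.g., weekly initial
# unemployment claims and a Google Trends index for queries such as
# "file for unemployment".
rng = np.random.default_rng(0)
n = 120
search_index = rng.normal(50, 10, n)                      # observable in real time
claims = 300 + 2.0 * search_index + rng.normal(0, 15, n)  # published with a lag

# Bridge regression: claims_t ~ 1 + claims_{t-1} + search_index_t.
# Fit on all periods except the latest, which is held out.
y = claims[1:-1]
X = np.column_stack([np.ones(n - 2), claims[:-2], search_index[1:-1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Nowcast" the latest period: its search index is already observable
# even though the official claims number is not yet released.
nowcast = beta @ np.array([1.0, claims[-2], search_index[-1]])
print(f"estimated coefficients: {beta}")
print(f"nowcast of current-period claims: {nowcast:.1f}")
```

The search term is what turns a forecast into a “nowcast”: the proxy is available immediately, while the official series only arrives with a publication lag.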

The need for more timely and sensitive measures of economic activity in an increasingly volatile economic environment was made especially clear during the Great Recession. A recent IMF paper (Quarterly GDP Revisions in G-20 Countries: Evidence from the 2008 Financial Crisis) discussed by Menzie Chinn documents how the systems used to compile quarterly GDP in most advanced countries are well designed to track an economy on a steady growth path, but are less suited to fully picking up large swings in economic activity in real time. Early estimates of quarterly GDP become less reliable with increased economic volatility, typically requiring downward ex post revisions during recessions and upward revisions during expansions. Brent Moulton and Dennis Fixler of the BEA note that the total revision of 2008Q4 US GDP from the advance estimate to the latest is the largest downward GDP revision on record.

Source: BEA

Marcelo Giugale writes that similar innovations could be observed in developing countries, where official statistics can be several years old. Pointing out that Africans should not have to wait a decade after a survey is fielded to adjust their country’s GDP or estimate poverty numbers, Wolfgang Fengler argues that poor countries can leapfrog this process with estimates built on the basis of Big Data. This may contribute to fixing what World Bank Chief Economist for Africa Shanta Devarajan refers to as the continent’s “statistical tragedy”. In Indonesia, joint research by the UN Global Pulse team and Crimson Hexagon found that conversations about rice on Twitter could help monitor food prices with surprising accuracy.
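A rough sketch of the kind of signal-tracking involved, again with synthetic stand-in data (the actual Global Pulse study worked with filtered and classified social-media conversations): count price-related tweets per week and check whether the volume co-moves with an official price series.

```python
import numpy as np

# Synthetic stand-ins: a weekly official rice price index and weekly
# counts of tweets matching food-price keywords.
rng = np.random.default_rng(1)
weeks = 52
price = 100 + np.cumsum(rng.normal(0, 1, weeks))  # slow official series
tweets = 200 + 15 * (price - price.mean()) + rng.normal(0, 20, weeks)

# If the two series track each other, the cheap, real-time tweet count
# can stand in for the official series between releases.
corr = np.corrcoef(price, tweets)[0, 1]
print(f"correlation between tweet volume and price index: {corr:.2f}")
```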

But the national and international statistical communities recognize the many challenges ahead, discussed during several recent events at the World Bank, at the OECD in October and January, and most recently during the 44th session of the United Nations Statistical Commission. A recent paper commissioned by the United Nations Economic Commission for Europe encourages National Statistical Offices to develop internal Big Data analytical capability through specialized training. Reflecting these growing concerns among national statisticians, the 2013 International Statistical Institute’s World Statistics Congress, to be held in Hong Kong in August, will include a special session on Big Data and official statistics.

Michael Horrigan discusses the future use of Big Data by the U.S. statistical system and notes that integrating private sources of data such as Google, Intuit and the Billion Prices Project is unlikely without more transparency from these providers. Progress towards better comparability between Big Data results and official statistics seems more likely. Andrew Wyckoff reckons that the new roles National Statistical Offices may play in the Big Data age include acting as a trusted third-party certifier and an issuer of best practices for data mining.

Big Data and alternative measures of welfare

Big Data may also offer ways to go beyond traditional economic statistics. Alternative measures of welfare have been used for decades, including the Human Development Index since the early 1990s and the Kingdom of Bhutan’s Gross National Happiness since 1972, as well as the more recent OECD Better Life Index and Happy Planet Index. Big Data has revived some of these debates by providing new data sources and analytics tools.

Ben Bernanke said in a recent speech that we should seek better and more direct measurements of economic well-being, the ultimate objective of our policy decisions. Miles Kimball notes that these concerns echo the position of British Prime Minister David Cameron, who said policymakers should focus not just on GDP but on general wellbeing, as well as that of former French President Nicolas Sarkozy, who pledged to fight to make all international organizations change their statistical systems by following the recommendations of the Stiglitz Commission on the Measurement of Economic Performance and Social Progress.

On the recent occasion of the United Nations International Day of Happiness, John Havens, founder of the H(app)athon Project, described the project’s main aim as giving “big data a direction”, drawing in part on the insights of the Quantified Self movement co-founded by Wired’s Kevin Kelly, which relies on self-monitoring and the collection and use of personal data. Esther Dyson predicts the emergence of a similar “Quantified Community” movement, with communities relying on their own data to measure and improve the state, health, and activities of their people and institutions.

Miles Kimball wonders how we can avoid the dangers of manipulation and politicization of new indicators of well-being. If we are going to use subjective measures of well-being such as happiness and life satisfaction alongside well-seasoned measures such as GDP to assess how well a nation is doing, he suggests, we need to proceed in a careful, scientific way that can stand the test of time.

Data sharing and privacy

Since the bulk of Big Data is held by corporations, Robert Kirkpatrick and Patrick Meier have been promoting the concept of “data philanthropy” within the context of development and humanitarian response. Data philanthropy involves companies sharing proprietary datasets for social good. Michael Horrigan argues that greater progress is likely to be made using Big Data from businesses than from households. Hal Varian discusses the incentives firms have to provide data:

  • Profit motive (Mastercard, Visa)
  • Brand identity, thought leadership (Intuit, Monster, Zillow, Google)
  • Financial reporting to investors (FedEx, UPS, retail)

The World Economic Forum proposes to provide greater individual control over data usage. The New York Times reports that Alex Pentland, an adviser to the WEF, proposes “a new deal on data” that would entail the rights to possess one’s data, to control how it is used, and to destroy or distribute it as one sees fit.

Felix Gillette writes in Bloomberg that at the moment, the default setting for almost everything people share online is that it will live for eternity in the cloud. In Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger, a lawyer and a professor at the University of Oxford, argues that this inevitably creates problems for individuals and societies that need the ability to forget in order to move forward. A perfect memory, he writes, can be paralyzing, trapping people in the past and discouraging them from trying new challenges.

Viktor Mayer-Schönberger argues that all information created online should come with customizable expiration dates. Not every piece of data would have to expire after a few seconds as photos on Snapchat do. The key, says Mayer-Schönberger, is to include some form of a self-destruct button in everything created online and to give consumers the power to tinker with the settings from the outset.
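As a toy illustration of what such customizable expiration dates could look like at the data layer (all names below are hypothetical, not any existing API), each record might carry a user-chosen time-to-live, with a periodic sweep playing the role of the self-destruct button:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class ExpiringRecord:
    """A piece of user-created content with a user-chosen expiration."""
    content: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    ttl: Optional[timedelta] = None  # None = keep forever (today's default)

    def is_expired(self, now: Optional[datetime] = None) -> bool:
        if self.ttl is None:
            return False
        now = now or datetime.now(timezone.utc)
        return now >= self.created_at + self.ttl

def sweep(store: list) -> list:
    """The self-destruct pass: drop everything past its expiration date."""
    return [record for record in store if not record.is_expired()]

# A photo set to vanish in seconds (Snapchat-style) next to a comment
# the user chose to keep for a year.
store = [
    ExpiringRecord("holiday photo", ttl=timedelta(seconds=10)),
    ExpiringRecord("blog comment", ttl=timedelta(days=365)),
]
store = sweep(store)
```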


Republishing and referencing

Bruegel considers itself a public good and takes no institutional standpoint. Anyone is free to republish and/or quote this post without prior consent. Please provide a full reference, clearly stating Bruegel and the relevant author as the source, and include a prominent hyperlink to the original post.
