

Analytics Magazine

Data Assets: Information decay

March/April 2014

How the value of information diminishes over time.

Dhiraj Rajaram, Krishna Rupanagunta and Aditya Kumbakonam


If you still remember high school chemistry, you may recall that radioactive decay is an inherent property of unstable atoms. And as a quantum physicist would tell you, while it is impossible to predict when a particular atom will decay, the chance that a given atom will decay in any interval is constant over time. We believe the same principle holds true for information within an organization: While it is difficult to predict when a particular information entity (e.g., a set of data records) will lose its relevance for a decision-maker, it is certain that all information loses value over time.

The parallels are striking – so much so that we believe that every information entity should have an attribute called “information decay” that describes how the value of this information decreases over time, much like the half-life captures the rate of decay of a radioactive substance. This phenomenon has accelerated in recent times as advances in analytics, data and technology have transformed the way organizations leverage information to drive decisions. Thanks to an explosion in data, there is not only a lot of information, but the rate of information accumulation is accelerating as well. Combine that with a highly dynamic business environment, and it becomes clear that the value of each information entity is decreasing at an ever-faster rate.
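If one takes the half-life analogy literally, the value of an information entity can be modeled with the same exponential form used for radioactive decay. A minimal sketch in Python – the function name, the 30-day half-life and the starting value are illustrative assumptions, not figures from the article:

```python
def information_value(initial_value, age, half_life):
    """Value of an information entity after `age` time units,
    assuming exponential decay with the given half-life
    (same form as radioactive decay: V = V0 * 0.5 ** (t / h))."""
    return initial_value * 0.5 ** (age / half_life)

# Hypothetical example: a data asset worth 100 units with a 30-day
# half-life retains half its value after 30 days, a quarter after 60.
print(information_value(100, 30, 30))  # 50.0
print(information_value(100, 60, 30))  # 25.0
```

The half-life parameter is exactly the “information decay” attribute the article proposes: a fast-moving entity such as a competitor price would get a short half-life, a slow-moving one such as a demographic profile a long one.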

There is no better example of information decay than that of the Oakland A’s from 1996-2004, famously chronicled in the book “Moneyball.” Starting in 1996, the team adopted a novel approach to scouting, driven by an analytical, evidence-based (“sabermetric”) method. The results were dramatic, as the A’s made it to the playoffs four straight years starting in 2000. Other teams then caught on, and the A’s lost their advantage rapidly – a case of rapid information decay.

An example of information decay that marketers will recognize is the half-life applied to GRPs/TRPs or impressions to create a derived metric, ad-stock, which is used to quantify the carryover impact of marketing. The delayed effects of marketing campaigns are well understood and have been successfully leveraged to measure short- and long-term effects on revenue and brand equity.
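The ad-stock transformation mentioned above is typically computed as a geometric carryover: each period's stock is that period's impressions plus a decayed fraction of the previous stock, with the retention rate derived from the half-life. A minimal sketch, assuming the common one-parameter geometric ad-stock form (the function name and the numbers are illustrative):

```python
def adstock(impressions, half_life):
    """Geometric ad-stock: stock_t = impressions_t + lam * stock_(t-1),
    where lam = 0.5 ** (1 / half_life), so the carried-over effect of
    a burst of impressions halves every `half_life` periods."""
    lam = 0.5 ** (1.0 / half_life)
    stock, series = 0.0, []
    for x in impressions:
        stock = x + lam * stock
        series.append(stock)
    return series

# A single burst of 100 GRPs with a one-period half-life:
print(adstock([100, 0, 0, 0], half_life=1))  # [100.0, 50.0, 25.0, 12.5]
```

The half-life here plays the same role as the proposed information-decay attribute: it encodes how quickly the marketing information stops influencing outcomes.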

Why Does Information Decay?

Information decays for several reasons, and, as is usually the case, more than one of the following reasons is typically at play:

Information becomes outdated: In many situations, information has a temporal value that decays unless refreshed on a regular basis. Consumer credit scores, for instance, need to be continuously refreshed in order to retain their value, which is directly impacted by the refresh frequency. In a world where a combination of data and technologies make near real-time refreshes possible, the information decay of consumer credit scores is increasing.

Natural decay of information: As technologies evolve, some information elements begin to lose relevance. Traditionally, surveys were the preferred (often, the only) method for companies to take the pulse of their customer base. However, with the explosion of e-commerce and social media, companies are increasingly tapping multi-channel data sources to better understand the moments of truth in the customer lifecycle. They are tapping into these newer, richer sources for better customer insights, and in the process the information value of surveys is diminishing.

The “efficient information hypothesis”: In finance, the efficient market hypothesis posits that the prices of traded assets reflect all the available information. As the access to information increases, its value decays. This information decay is accelerating at an unprecedented rate, thanks to technology.


A case in point is competitive pricing. Once upon a time, not very long ago, competitive pricing strategies kept scores of managers busy in organizations. And then came the Internet and with it, website scraping, which has given organizations the ability to track real-time changes to competitor prices. Big data technologies allow multiple retailers to dissect every price change in the ecosystem in near real time, sucking away any possible arbitrage opportunity. In other words, the Internet has accelerated the information decay rate of competitive pricing.

Another situation that is all too familiar to city dwellers is traffic information. As real-time information about traffic flows (or, more likely, snarls) becomes available, it triggers a bandwagon effect: drivers are redirected to hitherto unclogged routes, sucking them into the gridlock as well. The value of the traffic information comes down, and the speed with which this information is distributed determines its rate of information decay.

The “observer effect”: One of the more esoteric concepts in quantum physics, this refers to the changes that the very act of observation causes in the phenomenon being observed. The effect is well known in stock markets: often, the very act of an analyst initiating coverage of a relatively unknown stock brings attention to it. And more eyes on the stock can change its dynamics, altering the information decay of the stock price – until, of course, the efficient market hypothesis kicks in and brings the stock back to its natural levels.

Why Does Information Decay Matter?

Data is the “new oil” of the 21st century, and companies are fast accumulating data assets. Organizations need to invest in extracting the true value from data by institutionalizing a culture of data-driven decision-making. As they embark on this journey, managers would do well to recognize that the value of data, like any asset, depreciates over time.

To begin with, any data governance process should include a strong data value audit: a structured process that, on a pre-defined frequency, takes a critical look at every data element (from raw data to derived metrics) and evaluates its value. If it turns out that the value the business derives from a data element is decaying, follow up with corrective action – either refresh the computation method to revise the data element or replace it completely. This alone should set companies well along the way to incorporating the concept of information decay into their data DNA.
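In its simplest form, such a data value audit could be automated by comparing each element's measured business value across audit cycles and flagging the decaying ones for review. A hypothetical sketch – the function, the element names and the 20 percent threshold are illustrative, not part of any standard governance framework:

```python
def flag_decaying_elements(value_history, decay_threshold=0.2):
    """Return names of data elements whose measured value has fallen
    by more than `decay_threshold` (as a fraction) between the first
    and the latest audit cycle."""
    flagged = []
    for name, values in value_history.items():
        if len(values) >= 2 and values[-1] < values[0] * (1 - decay_threshold):
            flagged.append(name)
    return flagged

# Hypothetical audit scores per data element across three cycles:
history = {
    "credit_score": [100, 98, 95],  # mild decay: keep as-is
    "survey_nps": [100, 80, 60],    # sharp decay: refresh or replace
}
print(flag_decaying_elements(history))  # ['survey_nps']
```

Each flagged element would then go through the corrective action described above: refresh the computation method or replace the element entirely.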

Over time, we expect information decay – which we now refer to as “µDecay” – to be formalized as an attribute of every information entity. And once organizations begin to measure information decay and drive corrective actions, the value they can extract out of data assets will grow.

Dhiraj Rajaram is the CEO of Mu Sigma, an analytics services provider that serves more than 75 of the Fortune 500 companies. Krishna Rupanagunta is the “geography head” and Aditya Kumbakonam is the “delivery head” at Mu Sigma.


