
Analytics Magazine

Data Assets: Information decay

March/April 2014

How the value of information diminishes over time.


By (l-r) DHIRAJ RAJARAM,
KRISHNA RUPANAGUNTA
AND ADITYA KUMBAKONAM

For those of you who still remember high school chemistry, you may recall that radioactive decay is an inherent property of unstable atomic nuclei. And as a quantum physicist would tell you, while it is impossible to predict when a particular atom will decay, the probability that a given atom decays in any interval is constant over time. We believe the same principle holds for information within an organization: While it is difficult to predict when a particular information entity (e.g., a set of data records) will lose its relevance for a decision-maker, it is certain that all information loses value over time.

The parallels are striking – so much so that we believe every information entity should have an attribute called “information decay” that describes how the value of that information decreases over time, much like the half-life captures the rate of decay of a radioactive substance. This phenomenon has accelerated in recent times as advances in analytics, data and technology have transformed the way organizations leverage information to drive decisions. Thanks to an explosion in data, there is not only a lot of information, but the rate of information accumulation is accelerating as well. Combine that with a highly dynamic business environment, and it becomes clear that the value of each information entity is decreasing at an ever-faster rate.
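The half-life analogy can be made concrete with a small sketch. Assuming, purely as an illustration, that an information entity’s value follows exponential decay (the function name and the 30-day half-life below are our own examples, not anything prescribed in the text):

```python
def information_value(initial_value, half_life, t):
    """Value of an information entity after t time units,
    assuming exponential decay with the given half-life."""
    return initial_value * 0.5 ** (t / half_life)

# A dataset worth 100 (arbitrary units) with a 30-day half-life
# retains half its value after 30 days, a quarter after 60.
print(information_value(100, 30, 30))  # 50.0
print(information_value(100, 30, 60))  # 25.0
```

An entity with rapid information decay (a short half-life) would simply warrant a more frequent refresh than one whose value erodes slowly.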

There is no better example of information decay than that of the Oakland A’s from 1996-2004, famously chronicled in the book “Moneyball.” Starting in 1996, the team adopted a novel approach to scouting, driven by an analytical, evidence-based (“sabermetric”) methodology. The results were dramatic, as the A’s made the playoffs four straight years starting in 2000. Other teams then caught on, and the A’s lost their advantage rapidly – a case of rapid information decay.

An example of information decay that marketers would recognize is ad-stock, a derived metric that applies a half-life to GRPs/TRPs or impressions in order to quantify the impact of marketing. The delayed effects of marketing campaigns are well understood and have been successfully leveraged to measure short- and long-term effects on revenue and brand equity.
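The classic geometric ad-stock transformation illustrates the idea: each period’s exposure carries over into later periods, decaying at a rate implied by the half-life. A minimal sketch, assuming simple geometric carry-over (the input series and the two-period half-life are illustrative):

```python
def adstock(grps, half_life):
    """Geometric ad-stock: each period's GRPs plus the decayed
    carry-over of all earlier periods. The retention rate lam is
    chosen so the carry-over halves every `half_life` periods."""
    lam = 0.5 ** (1.0 / half_life)
    carried = 0.0
    out = []
    for x in grps:
        carried = x + lam * carried
        out.append(carried)
    return out

# A one-period burst of 100 GRPs with a two-period half-life:
# the effect persists but halves every two periods.
print(adstock([100, 0, 0, 0], half_life=2))
```

The same mechanics generalize: a shorter half-life models information (or advertising effect) that decays quickly; a longer one models slow erosion.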

Why Does Information Decay?

Information decays for several reasons, and, as is usually the case, more than one of the following reasons is typically at play:

Information becomes outdated: In many situations, information has a temporal value that decays unless refreshed on a regular basis. Consumer credit scores, for instance, must be refreshed continuously to retain their value, which is directly tied to the refresh frequency. In a world where the combination of data and technology makes near real-time refreshes possible, the information decay of consumer credit scores is increasing.

Natural decay of information: As technologies evolve, some information elements begin to lose relevance. Traditionally, surveys were the preferred (often, the only) method for companies to get a pulse of their customer base. However, with the explosion of e-commerce and social media, companies are increasingly tapping multi-channel data sources to better understand the moments of truth in the customer lifecycle. They are tapping into these newer, richer sources for better customer insights, and in the process the information value of surveys is diminishing.

The “efficient information hypothesis”: In finance, the efficient market hypothesis posits that the prices of traded assets reflect all available information. As access to information increases, its value decays – and thanks to technology, this information decay is accelerating at an unprecedented rate.


A case in point is competitive pricing. Once upon a time, not very long ago, competitive pricing strategies kept scores of managers busy in organizations. Then came the Internet and, with it, website scraping, which has given organizations the ability to track changes to competitor prices in real time. Big data technologies allow multiple retailers to dissect every price change in the ecosystem in near real time, sucking away any possible arbitrage opportunity. In other words, the Internet has accelerated the information decay rate of competitive pricing.

Another situation, all too familiar for city dwellers, is traffic information. As real-time information about traffic flows (or, more likely, snarls) becomes available, it triggers a bandwagon effect, redirecting traffic to hitherto unclogged routes and drawing them into the gridlock as well. The value of the traffic information declines, and the speed with which the information is distributed determines its rate of information decay.

The “observer effect”: One of the more esoteric concepts in quantum physics, this refers to the changes that the very act of observation causes in the phenomenon being observed. The effect is well known in stock markets: often, the very act of an analyst initiating coverage of a relatively unknown stock brings attention to it, and more eyes on the stock can change its dynamics, altering the information decay of its price – until, of course, the efficient market hypothesis kicks in and brings the stock back to its natural levels.

Why Does Information Decay Matter?

Data is the “new oil” of the 21st century, and companies are fast accumulating data assets. Organizations need to invest in extracting the true value from data by institutionalizing a culture of data-driven decision-making. As they embark on this journey, managers would do well to recognize that the value of data, like any asset, depreciates over time.

To begin with, any data governance process should include a strong data value audit: a structured process that, at a pre-defined frequency, takes a critical look at every data element (from raw data to derived metrics) and evaluates its value. If the value the business derives from a data element is decaying, follow up with corrective action – either refresh the computation method to revise the data element or replace it completely. This alone should set companies well along the way to incorporating the concept of information decay into their data DNA.
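Such an audit could be supported by a simple register that tracks the measured value of each data element across audit cycles and flags elements whose value is falling. A minimal sketch – the element name, the recorded values and the one-step decay test are all hypothetical illustrations, not a prescribed method:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """One entry in a (hypothetical) data value audit register."""
    name: str
    value_history: list = field(default_factory=list)  # value recorded at each audit

    def record_audit(self, measured_value):
        """Append the value measured in the current audit cycle."""
        self.value_history.append(measured_value)

    def is_decaying(self):
        """Flag an element whose measured value fell at the latest audit."""
        return (len(self.value_history) >= 2
                and self.value_history[-1] < self.value_history[-2])

# Three audit cycles for one element; a falling value triggers the flag,
# prompting a refresh of the computation or replacement of the element.
credit_score = DataElement("consumer_credit_score")
for v in (90, 85, 70):
    credit_score.record_audit(v)
print(credit_score.is_decaying())  # True
```

A production version would compare against a decay threshold rather than a single drop, but the shape of the process – measure on a fixed cadence, flag decay, trigger corrective action – is the same.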

Over time, we expect information decay – which we now refer to as “µDecay” – to be formalized as an attribute of every information entity. And once organizations begin to measure information decay and drive corrective actions, the value they can extract out of data assets will grow.


Dhiraj Rajaram is the CEO of Mu Sigma, an analytics services firm that serves more than 75 Fortune 500 companies. Krishna Rupanagunta is the geography head and Aditya Kumbakonam is the delivery head at Mu Sigma.
