Analytics Magazine

Data Assets: Information decay

March/April 2014

How the value of information diminishes over time.


By DHIRAJ RAJARAM, KRISHNA RUPANAGUNTA
AND ADITYA KUMBAKONAM

Those of you who still remember high school chemistry may recall that radioactive decay is an inherent property of unstable matter. And as a quantum physicist would tell you, while it is impossible to predict when a particular atom will decay, the chance that a given atom will decay is constant over time. We believe the same principle holds for information within an organization: While it is difficult to predict when a particular information entity (e.g., a set of data records) will lose its relevance for a decision-maker, it is certain that all information loses value over time.

The parallels are striking – so much so that we believe every information entity should have an attribute called “information decay” that describes how the value of that information decreases over time, much like the half-life captures the rate of decay of radioactive matter. This phenomenon has accelerated in recent times, as advances in analytics, data and technology have transformed the way organizations leverage information to drive decisions. Thanks to an explosion in data, there is not only a lot of information, but the rate at which it accumulates is accelerating as well. Combine that with a highly dynamic business environment, and it becomes clear that the value of each information entity is decreasing at an ever-faster rate.
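To make the analogy concrete, here is a minimal sketch (our illustration, not a prescribed formula – the function name, units and half-life are assumptions) of what a half-life-based decay attribute might look like, if information value decayed exponentially the way radioactive matter does:

```python
import math

def information_value(v0, half_life, t):
    """Value of an information entity after t periods, assuming
    (hypothetically) exponential decay, as with radioactive matter."""
    decay_rate = math.log(2) / half_life  # decay rate implied by the half-life
    return v0 * math.exp(-decay_rate * t)

# An entity worth 100 units with a two-quarter half-life
# retains roughly 25 units of value after four quarters.
print(information_value(100, half_life=2, t=4))  # ~25.0
```

In practice, the half-life would have to be estimated separately for each information entity rather than assumed.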

There is no better example of information decay than that of the Oakland A’s from 1996-2004, famously chronicled in the book “Moneyball.” Starting in 1996, the team adopted a novel, analytical, evidence-based (“sabermetric”) approach to scouting. The results were dramatic, as the A’s made the playoffs four straight years starting in 2000. Other teams then caught on, and the A’s quickly lost their advantage – a case of rapid information decay.

An example of information decay that marketers will recognize is the half-life applied to GRPs/TRPs (gross/target rating points) or impressions to create a derived metric, ad-stock, which is used to quantify the carryover impact of advertising. These delayed effects of marketing campaigns are well understood and have been successfully used to measure short- and long-term effects on revenue and brand equity.
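As a minimal sketch (our illustration; the weekly GRP figures and the two-week half-life are assumptions), the classic geometric ad-stock transformation derives a per-period retention rate from a half-life and carries a fraction of each period’s stock forward:

```python
def adstock(grps, half_life):
    """Geometric ad-stock: each period retains a fraction of the previous
    period's stock, with the retention rate set so that carryover halves
    after `half_life` periods (an assumed input)."""
    retention = 0.5 ** (1.0 / half_life)
    stock, out = 0.0, []
    for g in grps:
        stock = g + retention * stock
        out.append(round(stock, 1))
    return out

# Weekly GRPs for an 8-week flight with an assumed two-week half-life
print(adstock([100, 0, 0, 0, 50, 0, 0, 0], half_life=2))
# -> [100.0, 70.7, 50.0, 35.4, 75.0, 53.0, 37.5, 26.5]
```

Note how the stock from week 1 halves by week 3, consistent with the two-week half-life.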

Why Does Information Decay?

Information decays for several reasons, and more than one of the following is typically at play:

Information becomes outdated: In many situations, information has a temporal value that decays unless it is refreshed on a regular basis. Consumer credit scores, for instance, need to be continuously refreshed in order to retain their value, which is directly impacted by the refresh frequency. In a world where the combination of data and technology makes near real-time refreshes possible, the information decay of consumer credit scores is increasing.

Natural decay of information: As technologies evolve, some information elements begin to lose relevance. Traditionally, surveys were the preferred (often, the only) method for companies to take the pulse of their customer base. However, with the explosion of e-commerce and social media, companies are increasingly tapping multi-channel data sources to better understand the moments of truth in the customer lifecycle. As they tap into these newer, richer sources for customer insights, the information value of surveys diminishes.

The “efficient information hypothesis”: In finance, the efficient market hypothesis posits that the prices of traded assets reflect all the available information. As the access to information increases, its value decays. This information decay is accelerating at an unprecedented rate, thanks to technology.

A case in point is competitive pricing. Once upon a time, not very long ago, competitive pricing strategies kept scores of managers busy in organizations. And then came the Internet and with it, website scraping, which has given organizations the ability to track real-time changes to competitor prices. Big data technologies allow multiple retailers to dissect every price change in the ecosystem in near real time, sucking away any possible arbitrage opportunity. In other words, the Internet has accelerated the information decay rate of competitive pricing.

Another situation that is all too familiar to city dwellers is traffic information. As real-time information about traffic flows (or, more likely, snarls) becomes available, it triggers a bandwagon effect, redirecting traffic to hitherto unclogged routes and dragging them into gridlock as well. The value of the traffic information declines, and the speed with which the information is distributed determines its rate of information decay.

The “observer effect”: One of the more esoteric concepts in quantum physics, this refers to the changes that the very act of observation causes in the phenomenon being observed. It is well known in stock markets: Often, the very act of an analyst initiating coverage of a relatively unknown stock brings attention to it, and more eyes on the stock can change its dynamics, altering the information decay of the stock price – until, of course, the efficient market hypothesis kicks in and brings the stock back to its natural level.

Why Does Information Decay Matter?

Data is the “new oil” of the 21st century, and companies are fast accumulating data assets. Organizations need to invest in extracting the true value from data by institutionalizing a culture of data-driven decision-making. As they embark on this journey, managers would do well to recognize that the value of data, like any asset, depreciates over time.

To begin with, any data governance process should include a strong data value audit: a structured process that, on a pre-defined frequency, takes a critical look at every data element (from raw data to derived metrics) and evaluates its value. If it turns out that the value the business derives from a data element is decaying, follow up with corrective action – either refresh the computation method to revise the data element or replace it completely. This alone should set companies well along the way toward incorporating the concept of information decay into their data DNA.
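A minimal sketch of such an audit (the field names, threshold and values below are our assumptions, not a prescribed design):

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    """One entry in a hypothetical data value audit catalog."""
    name: str
    value_now: float    # business value measured this audit cycle
    value_prior: float  # business value at the previous cycle

    @property
    def mu_decay(self) -> float:
        """Fraction of value lost since the last audit."""
        return 1.0 - self.value_now / self.value_prior

def audit(elements, threshold=0.25):
    """Flag elements decaying faster than the threshold for corrective
    action: refresh the computation method, or replace the element."""
    return [e.name for e in elements if e.mu_decay > threshold]

catalog = [
    DataElement("survey_nps", value_now=40, value_prior=80),     # lost 50%
    DataElement("credit_score", value_now=95, value_prior=100),  # lost 5%
]
print(audit(catalog))  # -> ['survey_nps']
```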

Over time, we expect information decay – which we now refer to as “µDecay” – to be formalized as an attribute of every information entity. And once organizations begin to measure information decay and drive corrective actions, the value they can extract out of data assets will grow.


Dhiraj Rajaram is the CEO of Mu Sigma, an analytics services firm that works with more than 75 of the Fortune 500. Krishna Rupanagunta is the geography head and Aditya Kumbakonam is the delivery head at Mu Sigma.
