Myths of Analytics: 10 myths of analytics and insights
By Will Towler
With so much attention given to analytics and insights over the past few years, revisiting some of the industry’s longstanding myths seems in order. Whether you work with traditional consumer research methods or emerging business intelligence techniques, here are 10 fallacies to keep in mind to make the most out of your efforts:
1. Scientific evidence is proof
History is full of inaccurate predictions and once-believable theories that turned out to be wrong. Even Einstein, whose name is now synonymous with genius, at one point embraced the now obsolete static universe model. In marketing, perhaps no blunder is more famous than the New Coke launch in 1985. Taste tests with close to 200,000 consumers indicated that people preferred New Coke's taste to the original formula. However, Coca-Cola executives failed to consider that brand heritage might trump taste, and the company eventually reintroduced the original formula in response to negative public reaction.
Takeaway: Science has always been subject to error; keep an open mind to alternative possibilities.
2. Knowledge is power
In their book, “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” Viktor Mayer-Schönberger and Kenneth Cukier contend that it doesn’t matter “why” there’s correlation, just that there “is” correlation. They cite the example of greater Pop-Tart sales during storms, an insight Walmart has used to merchandise more effectively. According to the authors, simply knowing that Pop-Tart sales are likely to increase is sufficient; deeper investigation into underlying causes is unnecessary.
While there’s no doubt this finding has valuable business implications, relying on correlations alone can limit broader applications and can even entail risk. Case in point: the 2008 financial crisis. Analysis of AAA-rated collateralized debt obligations suggested they were sound investments (i.e., AAA = safe), but deeper analysis would have revealed the danger.
Takeaway: As data plays a growing role in decision-making, understanding underlying relationships has never been more important.
3. Correlation measures relationship strength
A strong case can be made for questioning the extent to which correlation reflects a relationship between even seemingly interdependent variables. Consider U.S. healthcare expenditures and deaths from heart disease. Between 1960 and 2010, spending on healthcare in the United States increased more than sevenfold after adjusting for inflation. Over the same period, deaths due to heart disease more than halved, making it easy to conclude that the two are closely related. Advances in medical care are believed to have contributed to lower heart disease mortality through improved diagnosis and treatment. However, broader lifestyle and diet changes also played significant roles. Furthermore, a review of other chronic disease trends reveals that some medical conditions, such as diabetes, have worsened (see Figure 1). There are also myriad other factors potentially related to escalating healthcare costs, such as an aging population, greater administrative expenses and broader marketing pressures.
For another take on how correlations can mask deeper relationships, see Christopher Knittel and Aaron Smith’s paper “Ethanol Production and Gasoline Prices: A Spurious Correlation.”
Takeaway: Seriously consider whether a correlation reflects a true relationship or simply masks the influence of one or more hidden variables.
|Figure 1: Correlations can potentially mask deeper relationships.|
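The spending-and-deaths pattern above can be illustrated with a minimal sketch. The numbers below are invented, not the actual healthcare data: two series that each merely trend with time end up almost perfectly correlated, even though neither causes the other.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
years = range(50)

# Hidden driver: both series simply trend with time (plus noise).
spending = [100 * 1.04 ** t + random.gauss(0, 20) for t in years]  # rises ~4%/yr
deaths = [500 * 0.985 ** t + random.gauss(0, 10) for t in years]   # falls ~1.5%/yr

r = pearson(spending, deaths)
print(f"correlation between spending and deaths: {r:.2f}")
```

The strong negative correlation says nothing about whether spending reduced deaths; time is the lurking variable linking the two.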
4. Random sampling ensures representation
Unless you’re working with full universe coverage, some form of sampling is usually required. And while fantastic in theory, true randomness is difficult to achieve in practice. Transactional data are constrained by membership and opt-outs; surveys face non-response; and social media content is subject to self-reporting biases. Beyond sampling, the challenge of unbiased representation is exacerbated by a number of factors ranging from human predispositions to herd mentality.
Sinan Aral, for example, wrote a convincing piece in the MIT Sloan Management Review explaining why online customer reviews tend to be J-shaped rather than bell-curved. Citing several studies, Aral explains how herd mentality can produce a disproportionate concentration of positive ratings, skewing online reviews over both the short and long term. It’s another example of how things aren’t always what they appear to be.
Takeaway: Accurate representation without some form of post hoc control is frequently elusive.
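One simple mechanism behind J-shaped reviews is self-selection: customers with extreme experiences are far more likely to post at all. The sketch below uses hypothetical posting probabilities (not figures from Aral's studies) to show how uniformly distributed true opinions turn into a J-shaped set of posted ratings.

```python
import random

random.seed(7)

def post_probability(rating):
    # Assumption: extreme experiences (1s and 5s) are far more likely
    # to be reported than middling ones. Probabilities are invented.
    return {1: 0.6, 2: 0.1, 3: 0.05, 4: 0.1, 5: 0.7}[rating]

# True opinions: uniform across 1-5 stars.
true_ratings = [random.choice([1, 2, 3, 4, 5]) for _ in range(10_000)]

# Posted reviews: each customer posts only with their rating's probability.
posted = [r for r in true_ratings if random.random() < post_probability(r)]

for star in range(1, 6):
    print(star, posted.count(star))
```

Even though true sentiment is flat, the posted distribution piles up at 5 and 1 stars with a trough in the middle, so the sample no longer represents the population.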
5. People are rational
Humans act irrationally. As consumers, we often derive greater satisfaction from the same item when it costs more (not less); we let decoy options steer us toward suboptimal decisions (such as buying something bigger than we normally would); and we frequently stick with what we’re most familiar with (even when another choice is better for us). The popularity of behavioral economics reflects a growing interest in tapping irrationality for business and government. Plenty of questions remain about how best to merge economics and psychology, but there is little doubt the two are inextricably linked beyond traditional utility theory.
Takeaway: Even the tightest models are fallible due to the unpredictability of human behavior.
6. Recall is a trusted form of memory
It doesn’t take advanced science to know that our memories are imperfect, often causing us to forget information and events. University of Illinois professor Brian Gonsalves conducted research in which subjects shown simple images and descriptions were able to recall the information correctly only 54 percent of the time. Gonsalves explains that people may remember the general context of an event but fail to encode specific details. Another study, on the effectiveness of different advertising research methodologies, found that less than 70 percent of respondents could accurately recall ad exposure. The research also revealed that up to 25 percent of those who were not shown an ad incorrectly recalled exposure (see Figure 2).
Takeaway: Trusting recall as a research methodology brings with it serious limitations.
|Figure 2: Not so total recall.|
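The two recall figures quoted above can be combined to show how badly claimed recall can distort measured ad reach. The true exposure rate below is an assumption for illustration; the hit and false-alarm rates are the article's 70 percent and 25 percent figures.

```python
# Hypothetical illustration using the recall figures quoted above.
true_exposure = 0.50   # assumed: half the audience actually saw the ad
hit_rate = 0.70        # exposed respondents who correctly recall the ad
false_alarm = 0.25     # unexposed respondents who claim recall anyway

# Measured "recall" mixes genuine memory with false memory.
measured = true_exposure * hit_rate + (1 - true_exposure) * false_alarm
print(f"claimed recall: {measured:.1%}")
```

Nearly half the sample claims exposure, which looks close to the true 50 percent reach, but a quarter of those claims come from people who never saw the ad, so any profiling of "exposed" respondents is contaminated.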
7. You can’t manage what you can’t measure
A century ago Lord Leverhulme famously said, “I know half my advertising isn’t working; I just don’t know which half.” The uncertainty that comes with advertising will probably never go away. One key reason is the inability to accurately measure priming, in which exposure to one stimulus influences the response to another. While the extent of priming is debated, its existence is not: our environment influences our subconscious through sights, sounds and smells. Moreover, the effects of priming can last well beyond what one might expect.
Takeaway: The return on some investments is not always immediate, often making short-term measurement a futile task.
8. Sales and marketing metrics are different
Sales and marketing share the same overarching objective of driving profitable growth, so having common, or at least integrated, key performance indicators would seem reasonable. Yet the two functions often track different metrics reviewed in isolation (e.g., sales might focus on funnel activity while marketing concentrates on brand and campaign assessment). Shared or integrated metrics not only align teams but also ensure that what’s being measured is most relevant.
Strategy and customer experience consultant Christine Crandell recommends three groups of measures: end-to-end conversion, revenue diversity and outcome profitability. According to Crandell, “While there are literally hundreds of sales and marketing metrics that can be used, it comes down to (these) three that measure alignment and frame that all-too-critical joint conversation with sales and marketing about what’s working and what isn’t.”
Takeaway: Sales and marketing metrics should be shared or at least integrated in order to work toward the same goal and evaluate performance holistically.
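An end-to-end conversion view of the kind Crandell describes can be sketched from a shared funnel. The stage names and counts below are invented for illustration; the point is that both teams read one chain of numbers instead of separate dashboards.

```python
# Hypothetical funnel snapshot shared by sales and marketing (numbers invented).
funnel = {"inquiries": 5000, "qualified_leads": 800, "opportunities": 200, "wins": 50}

# Stage-to-stage conversion rates, readable by either team.
stages = list(funnel)
for prev, nxt in zip(stages, stages[1:]):
    rate = funnel[nxt] / funnel[prev]
    print(f"{prev} -> {nxt}: {rate:.0%}")

# The single end-to-end number that frames the joint conversation.
end_to_end = funnel["wins"] / funnel["inquiries"]
print(f"end-to-end conversion: {end_to_end:.1%}")
```

With one shared chain, a weak stage (say, qualified leads to opportunities) is visible to both functions rather than hidden inside one team's reporting.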
9. Sound analytics drive sound decision-making
The Corporate Executive Board recently studied 5,000 employees at 22 global companies and found that just over one-third balance judgment and analysis, the combination key to effective decision-making. The other two-thirds either go exclusively with their gut or trust analysis over judgment. The study highlights how even sound insights can fail to drive good decisions because of broader organizational issues. As an accompanying article in the Harvard Business Review put it, “At this very moment, there’s an odds-on chance that someone in your organization is making a poor decision on the basis of information that was enormously expensive to collect.”
Takeaway: Quality output is only half the battle when it comes to impactful analytics and insights; the other half is an organization’s ability to act on it effectively.
10. Great insights sell themselves
You conducted a momentous research project or built a groundbreaking business intelligence system. Why wouldn’t your work attract fans and lead to positive change? Industry track records, however, suggest most projects fail to accomplish their goals. Poor problem definition and operational snafus are common culprits, but even projects with clearly defined objectives and smooth implementations can fall short of expectations in the absence of effective communication and stakeholder engagement.
According to the Project Management Institute, 80 percent of projects at organizations considered highly effective communicators meet their original goals, vs. only 52 percent at minimally effective counterparts.
Takeaway: Projects without a well-crafted stakeholder engagement and communication plan have little chance of success from the get-go.
New data sources and diagnostic capabilities continue to expand the potential value of analytics and insights. But the longstanding truths of how to make an impact haven’t changed and likely won’t. The fundamentals of statistics, human nature, stakeholder engagement and communication remain the same, and applying them effectively largely determines project success.
Will Towler is an analytics and insights consultant working in the Seattle area, with nearly 20 years’ experience in consumer research and business intelligence. For more, visit his website: www.insighttrends.com.