
Analytics Magazine

Predictive Analytics: TV ads, Wanamaker’s dilemma & analytics

March/April 2011


The journey from historical ratings to predictive decisions: A proposed evolution for buyers and sellers of television advertising.

By (left to right) Eric Fischer, Atanu Basu, Joachim Hubele and Eric Levine

John Wanamaker, the father of modern advertising, famously stated in the early 20th century, “I know that half of my advertising dollars are wasted … I just don’t know which half.” Unfortunately, as marketing, advertising and media move at warp speed, the dilemma of this old adage has only gotten worse for even the most sophisticated advertisers.

If this adage is correct, exactly how much money does it represent? PricewaterhouseCoopers and Wilkofsky Gruen Associates’ “Global Entertainment and Media Outlook 2009-13” measures media advertising spending at $420 billion in 2009. With upticks in 2010 and a projected uptick in 2011, the marketplace is rapidly approaching half a trillion dollars. If Wanamaker’s quote is accurate, advertisers will soon be wasting $250 billion a year in media spending. Focusing on the largest medium, television, the 2009 figures represent a $150 billion marketplace, meaning $75 billion may have been wasted. It is safe to assume no other industry spends this sum of money without a definitive return on its investment. No wonder marketing expenditures keep compliance and financial officers up at night.

Marketers have traditionally fallen back on conventional wisdom that advertising is different from other organizational investing such as capital expenditures and human resources. They point out advertising is both science and art, and it’s the artful component of that equation that makes it very difficult, if not impossible, to effectively quantify the investment of those advertising expenditures.

The “holy grail” for most senior marketers is to successfully address Wanamaker’s dilemma and prove a causal relationship between advertising dollars spent and the ROI impact of that spending. Marketers attempting to pursue this objective must create a two-pronged approach properly assessing both the creative message and the media used to deliver the message. Marketers conduct exhaustive research to measure the effectiveness of the creative aspects of the messaging, but there is scant, independent research available to assist marketers in testing the efficacy and efficiency of the actual media used to deliver the message. To make an analogy, using improper media to deliver the proper creative is like a doctor administering the right vaccine, but using the wrong delivery system – say orally vs. injected – or worse, targeting the wrong cells. Either way the results are the same – an ineffective outcome with wasted resources.

If we focus on the delivery-mechanism side of the equation – the actual media vehicles used to relay the message – it’s logical to begin with the largest and most popular medium in terms of spending: terrestrial, cable and satellite-delivered television. The “resurgence” of television advertising (most in the business will laugh at the supposition that it was ever diminished) suggests that reports of the “death” of the medium by some pundits were, like that of Mark Twain, “greatly exaggerated.”

Recent findings from Deloitte state, “Prophecies of the imminent obsolescence of television will again be proved wrong in 2011 and instead its status as a ‘super media’ will be reinforced.” According to Deloitte, 2011 will represent the fifth consecutive year of television advertising growth, despite the 2008-09 recession. Ed Shedd, lead media partner of Deloitte, said: “This year’s predictions show television’s continued strength, which continues to lead all media in total revenues, including advertising sales, subscriptions, pay-per-view and license fees. … In addition some 40 million new viewers will tune in for the first time and more than 140 billion more hours of content will be watched around the world.”

Figure 1: Television advertising share of total. (Source: Deloitte Touche Tohmatsu Limited analysis, ZenithOptimedia, advertising expenditure forecasts, December 2010.)

With millions of new viewers and current audiences consuming more television than ever before, a careful examination of the business of television advertising still finds an industry predicated on little or no quantitative research to help its patrons make informed decisions.

With television, we’ve witnessed an antiquated, industry-wide standard of “ratings” become the singular source of data used to measure the effectiveness of the medium, but one that provides zero ROI-based guidance. And while there are secondary, qualitative variables layered on top of media measurements such as recall and attentiveness, these metrics largely measure the impact of the creative message, not the delivery method itself. When secondary measurement criteria are stripped out, the $150 billion spent is based on an arguably primitive, ineffective “ratings based” system developed half a century ago.

As people continue to spend time with television, one thing is clear – television has had, currently enjoys and will most likely continue to have a “most favored nation” status in the hearts and minds of consumers, and marketers will continue to spend heavily to reach their consumers in this medium. With a continually fragmented media landscape, television’s engagement with its audience is unique and one that provides marketers with an unsurpassed vehicle to brand their product or service. But as television continues its dominance, there needs to be significantly better measurement tools for chief marketing officers (CMOs) to know they are investing intelligently, they’re clear on the returns they’re getting, they understand why they’re getting those results, and they understand how they can use this information to make future decisions.

All this is not news to anyone in the industry, as countless hours and millions of dollars have been spent in pursuit of solving Wanamaker’s dilemma. No doubt great strides have been made to more effectively measure the number of people watching and how they consume television advertising. Efforts to measure out-of-home viewing and time-shifted viewing, and to expand the definition of the 30-second ad unit to include online video viewership, all make the business of television buying more exact.

Strides have also been made by Nielsen and companies such as IRI and TRA to link television viewing and sales information in an effort to provide a greater level of accountability. There have been misses in this process, notably the much ballyhooed “Project Apollo” in the mid-2000s, but the industry is on the right track in trying to allay the ROI issue faced by television advertisers.

Can we establish a direct link between dollars spent and ROI? Can we solve the problem of relying on ratings-based historical spending? Is there a technology to allow senior marketers to gain insights and assist them to make more informed purchasing decisions and make their next purchase more effective and efficient? Is there a role for predictive analytics and predictive decision-making in this process?

The Marketers’ Disadvantage

Most marketers lament the use of ratings points as the determinant for what is often their largest single capital expenditure – television advertising spending. Relying solely on this data greatly handicaps them versus their peers, as ratings are arguably the least effective data used anywhere in the organization. Where their counterparts use historical data to provide a framework for future decisions, marketers can only use their largest expenditure to track whether media vendors delivered their commitments, regardless of the results of those dollars spent. Clients, with their agencies, negotiate the vast majority of their media commitments with vendors on metrics such as the cost of reaching 1,000 people (“CPM” – cost per thousand, using the Roman numeral “M” for thousand) or the cost of purchasing one gross ratings point (“CPP” – cost per point).

While CPM and CPP provide a point of negotiation, they provide no ability to pre-select audience characteristics or to qualify the people the client would like to reach with the advertising message. Marketers may use aided and unaided recall testing on the messaging, but there is no industry standard for determining whether the audience they paid for paid attention to the product or service being promoted. Instead, ratings data simply provide the number of people capable of being reached by the advertising, and CPM and CPP are no more than negotiating terms.
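To make the two negotiating metrics concrete, here is a minimal sketch of the arithmetic behind CPM and CPP; the dollar figures, impression counts and ratings points below are hypothetical:

```python
# Illustrative arithmetic for the two negotiating metrics described above.
# All figures are hypothetical.

def cpm(total_cost: float, impressions: int) -> float:
    """Cost per thousand impressions (the Roman numeral M = 1,000)."""
    return total_cost / (impressions / 1000)

def cpp(total_cost: float, gross_rating_points: float) -> float:
    """Cost per gross ratings point."""
    return total_cost / gross_rating_points

# A hypothetical $250,000 spot reaching 12.5 million viewers:
print(cpm(250_000, 12_500_000))  # 20.0 -> a $20 CPM
# The same spot delivering 10 gross ratings points:
print(cpp(250_000, 10))          # 25000.0 -> $25,000 per point
```

Note that both numbers describe only price per unit of potential audience – nothing in either formula says anything about who those viewers are or what they did after seeing the ad, which is exactly the gap the article describes.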

If using historical data as the basis for future decision-making is tantamount to driving a car while looking through the rearview mirror, the marketer relying on ratings is driving the car blindfolded; historical ratings are completely reactive and provide no forward-facing information and no utility for future decision-making. The best the marketer can hope for in terms of truly measuring the effectiveness of this key channel is the performance index of direct response. But for the many marketers who look to television to drive qualitative metrics such as brand preference, net promotion and awareness rather than pure transactional behavior, we’re back to the broken measurement framework of ratings.

Efforts to provide deeper metrics for television viewing are well intentioned and will hopefully prove successful in providing more substantial research for marketers. But are they going far enough? Ultimately, current efforts are being focused on the “what.” What is the ROI relationship between television advertising and the marketer’s goals from that advertising? Answering the “what” question is a great leap forward, but it only allows marketers to take off that blindfold and join their counterparts in driving the car looking through that rearview mirror.

To begin to learn how to drive looking through the windshield, marketers must look past the “what.” Through a deeper interpretation of data, they must begin to ask the questions of “why can this relationship be claimed” and “if I know the why, how do I use that to model future business decisions.” If this is possible, we can get closer to solving Wanamaker’s dilemma than ever before. The answer may lie in advanced analytics – the journey from descriptive to predictive to prescriptive analytics.

Potential Role of Predictive Decision Management

In this analogy, predictive decision management (PDM) is akin to driving while looking through the windshield. PDM is enabled by combining two powerful technology disciplines: predictive analytics (what will happen, when and why) and decision management (how to take advantage of this foresight). Foresight, though a game changer by itself compared to the current state of television advertising, needs to be combined with prescriptive actions to ensure maximum ROI for television advertisers. That’s the promise of PDM to CMOs.

PDM, the technology that makes forward-looking decisions possible, is a young but quickly maturing discipline. It is like a GPS on steroids – a much-enhanced version of the GPS we now use in our cars. With your GPS you can define a future goal, which is your destination. Your GPS will get you there by taking into account past data (e.g., street map), current data (e.g., traffic and weather conditions) and preferences (e.g., avoid toll roads). Now think about this: What if your GPS knew how much gas is in your fuel tank, how efficient your car’s engine is, what type of driver you are, what the traffic conditions will be and other valuable information? Then your GPS could not only get you to your destination on time and in one piece, it could do so while taking into account what’s ahead in the journey and what you can and can’t do. CMOs work under timelines and constraints, and have a finite set of knobs they can turn. Taking these business realities into account is essential for any technology that hopes to address the ROI conundrum these CMOs face in television advertising.

Two important constituencies in the TV advertising equation – advertisers and vendors – can benefit from predictive decision-making. Advertisers, of course, are the companies such as P&G, Unilever, Nike, BMW, etc. that advertise their products and services on TV. Vendors are the companies selling the TV spots to advertisers, and they can be subdivided into three groups: networks (ABC, NBC, CBS, FOX, etc.), syndicators (FOX, Paramount, Sony, Warner Brothers, etc.) and cable channels (MTV, ESPN, TNT, etc.).

For advertisers, the goal of working with PDM is to improve the effectiveness and efficiency of their television media spending. PDM provides a solution on two fronts. First, a more efficient television buy allows CMOs to make better use of their largest expenditure; PDM can help them buy more efficiently, allowing them either to reallocate dollars to other marketing resources or to buy more television-based advertising with the same allocation of resources. Second, PDM gives the marketer a more exact glimpse into current marketplace dynamics to better assess future markets and enter negotiations with better intelligence. PDM requires taking into account many internal and external datasets and business rules – things that are within the advertisers’ management control and things that are not. Datasets may include Nielsen Ratings, DVR usage, specifications of the product to be advertised, specifications of competing products in the market, advertising performance of competing products, target addressable markets (TAMs), target customers in these TAMs, econometric information, sentiment forecasts, etc. At a “macro” level, the objective is to answer three ultimate questions for the advertiser before she makes her purchase: How should I allocate my television media spending? Why? How will these allocations perform?

For vendors (who have a fixed supply of inventory and continuously need to project how to maximize its revenue potential), PDM requires taking into account more, and better, datasets and business rules. Vendors’ “market intelligence” currently relies on rudimentary macroeconomic data, financial news about their customers and anecdotal conjecture on future spending levels from the clients and agencies who are their negotiating adversaries. PDM provides a more reliable way to project their business by including datasets such as historical ad performance (Nielsen Ratings), historical pricing, market forecasts, econometric information, sentiment forecasts, competitive dynamics, etc. The algorithms involved may need to be refreshed more frequently for the more opportunistic “scatter market,” which represents about 25 percent of the market, than for the “upfront market,” which has longer lead times and represents about 75 percent. At a “macro” level, the objective is to answer the following questions for the vendor before she sells her available ad inventory to advertisers: How should I price the items in my ad inventory? For whom (target advertisers)? When is the ideal time to sell the inventory to maximize revenue, profit, etc.?
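Again purely as an illustration, and not an industry pricing model, the vendor’s pricing question can be sketched as choosing the spot price that maximizes expected revenue under an assumed demand curve linking price to probability of sale. All numbers below are hypothetical:

```python
# A toy illustration of the vendor-side question: pick the spot price that
# maximizes expected revenue (price x probability of sale), given an assumed
# demand curve. The demand numbers are hypothetical.

def best_price(candidates, sale_probability) -> float:
    """Return the candidate price with the highest expected revenue."""
    return max(candidates, key=lambda p: p * sale_probability(p))

# Assumed demand: higher prices sell less often.
demand = {100_000: 0.95, 150_000: 0.80, 200_000: 0.55, 250_000: 0.30}
price = best_price(demand, demand.get)
print(price, price * demand[price])  # 150000 120000.0
```

In this sketch the demand curve is simply given; the article’s argument is that PDM would estimate it from historical pricing, market forecasts and competitive dynamics, and refresh it more often for the fast-moving scatter market than for the upfront market.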

Would you drive without your GPS? Of course, many people have commuted on the same stretch of road for years. But unlike your commute route, the territory in today’s business environment is changing constantly. Under these conditions your business does need a GPS – an enhanced version of one – to provide a specific action plan for the future, so you can not only anticipate what’s ahead but also take advantage of it.

Are we any closer to solving Wanamaker’s dilemma? If we can start talking about the “why” and “how,” perhaps the nature of the conversation will change from 1950s terminology such as CPMs, CPPs and ratings to more reliable – predictive and prescriptive – insights and actions that give clients, agencies and vendors a more efficient and effective way to do business. PDM may one day hold the key to this $75 billion question.

Eric Fischer is an adjunct professor at the Walter Cronkite School of Journalism & Mass Communication at Arizona State University and founder of HJA Strategic Consulting LLC. He is a 20-year veteran of television advertising, having held senior positions at global agencies (McCann-Erickson Worldwide), media vendors (Disney, Fox, the NBA and Tribune) and clients (American Home Products Corp., now Wyeth Labs). His blog series explains the business of TV advertising.

Atanu Basu is the CEO of DataInfoCom, an analytics software company headquartered in Austin, Texas. DataInfoCom’s customers include Dell, Microsoft, Cisco Systems and Juniper Networks. Basu has 17 years of experience in the semiconductor and software industries, almost all of it in new technology development and sales. He holds a master’s degree in engineering.

Joachim Hubele is the director of technology at DataInfoCom. He has 20 years of experience in the technology and telecom industries, most of it in software engineering, platform architecture and business analytics. He holds a Ph.D. in physics and has taught graduate-level courses in data mining and analytics.

Eric Levine is president and founder of Woodnote Marketing, a brand positioning and marketing consultancy in Austin, Texas. With more than 20 years of marketing experience, he has held roles including director of marketing for Dell and CMO at enterprise software company Trilogy, following a decade in the advertising industry serving Fortune 500 brands including IBM, UPS and Delta Air Lines.


