Analytics Magazine

Executive Edge: Closing the gap between analytics and action

November/December 2011

By Chris Fry

Analytics is on the minds of executives more than ever before. Data are becoming more accessible, computing power is increasing to the point where we can actually crunch those data, and analytical software tools are becoming increasingly powerful and cost-effective as well [1].

With such an ideal environment, analytics should be having a massive impact on business performance. Many analytics efforts do have great impact [2], but too many never reach their full potential. How many times have you heard the story of the analyst who comes up with a great concept or model, but the affected decision-makers never hear anything about it, cannot understand it or are unable to “operationalize” it?

In my view, such efforts do not reach their full potential because the analysts and project managers haven’t closed the gap between analytics and action. To achieve the greatest impact in our profession, we must work hard to ensure that analytical work leads to action. We must create buy-in for our work, and make each recommendation as actionable as possible.

I have observed several strategies that can help to ensure impact from analytical work. Here are three of them.

1. Start with the process.

Analytics work does not always require process changes, but often it does. When an analyst designs a model to inform a repeated decision, for example, he must think about how the decision-making process will need to change to take advantage of the analytical solution.

Improving performance through analytics often requires careful integration of the analytics solution into business processes. We have seen countless analytics efforts fail after months were spent developing a beautiful model, because nobody took the time up front to answer questions such as:

  • Who is going to use the model?
  • What benefit will users gain by using the model?
  • Do those users have time to use the model?
  • Who is going to maintain the model?
  • Where are we going to get the data needed to feed the model?
  • How will the data get into the model?
  • Do the model outputs answer the right questions?
  • Can the model outputs be explained easily? Who will explain them?

A more effective approach involves starting with key process changes first and then adding analytical sophistication over time as needed and at the pace it can be absorbed.

As an example, my team recently developed a revenue management solution for a client whose organization had little prior experience with revenue management. We began by working with the client’s management team to define how a revenue management approach could work for their industry and business, and laid out a roadmap for how to put it together.

The initial solution used relatively simple analytics, mainly focused on providing visibility into critical pricing data that had not previously been readily available, with a recommendation algorithm based on a simple heuristic (a sketch of what such a heuristic might look like follows the list below). We spent a significant portion of our time setting up the infrastructure for their organization to use the solution, including:

  • Codifying the approach into a reusable model and working with the client’s IT department to automate data feeds into the model.
  • Providing training to the client’s users. In our case this involved running more than 10 in-person training sessions for user groups across the U.S. The goal was as much to convince them of the value in changing their process as it was to train them on the specifics. It was time well spent to ensure that the solution was adopted.
  • Clearly defining roles, responsibilities, metrics and incentives for users and management involved in the process.
  • Clarifying the product definition structure to take better advantage of revenue management.
  • Staging the rollout across several test markets so that we could tweak the process over time.

At present, the client has a system up and running with more than 500 users across the U.S. accessing it and deriving benefit from it daily. And because the process infrastructure was laid first, we are now able to layer additional analytical sophistication onto it. I suspect the solution would not have been adopted nearly as successfully had we not put as much energy into process development and deployment as we did.

2. Lift up the hood.

We find we are most successful at moving clients to action when we provide analytics that give them confidence in their decisions. This often means going to great lengths to achieve analytical transparency – making it clear what assumptions are made, what the data show and how the result is derived. Beware the black box. If our goal is to inform action, our analysis should enable us to explain why an action is appropriate, and it should do so in terms that decision-makers and stakeholders can understand.

My favorite way to achieve this type of transparency is through pictures (i.e., charts and graphs). Models can build insight by showing graphically how their outcomes would change under alternate decisions or under a range of uncertain scenarios. Charts and graphs can also provide insight into how a model outcome is derived in the first place [3].

When we build forecasting applications, for example, the forecast is typically only a small portion of the total information set provided as model outputs. We may include a graphical view of input data and assumptions, as well as intermediate outputs showing each step of the calculation for any product, demand segment, etc. When appropriate, we’ll include ranges and show how input uncertainties drive output uncertainties. Our goal is to build models that explain themselves to users – in many ways the opposite of a black box.
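
As a minimal sketch of what a “model that explains itself” can mean in code, the following Python example fits a simple linear trend (an assumption for exposition, not our production method; the history values are hypothetical) and reports the inputs, the fitted relationship and an uncertainty range alongside the forecast itself:

    # A minimal "self-explaining" forecast: report the inputs, the
    # intermediate steps and an uncertainty range, not just one number.
    import statistics

    history = [120, 132, 128, 141, 150, 147]  # hypothetical monthly demand
    n = len(history)
    xs = range(n)
    mean_x, mean_y = statistics.mean(xs), statistics.mean(history)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x

    fitted = [intercept + slope * x for x in xs]
    resid_sd = statistics.stdev(y - f for y, f in zip(history, fitted))
    forecast = intercept + slope * n

    print(f"Input history:        {history}")
    print(f"Fitted trend:         {intercept:.1f} + {slope:.2f} * t")
    print(f"Residual std. dev.:   {resid_sd:.1f}")
    print(f"Next-period forecast: {forecast:.1f} "
          f"(roughly {forecast - 2 * resid_sd:.1f} "
          f"to {forecast + 2 * resid_sd:.1f})")

Everything a user needs to challenge the forecast – the data, the trend, the spread – is on the screen with it.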

3. Highlight the “so what’s.”

It’s hard to close the gap between analytics and action if your analytics don’t make it clear what action you want the user to take, or what conclusion you wish your user to draw from the information they’re seeing. Analytical applications or reports that provide information, but do not provide guidance on how to interpret the information, are focused on the “what” but not the “so what.” Take for instance a reporting tool that lists inventory levels on all products in a warehouse at the end of each day. That’s the “what.” Layer into the report an extra line that indicates which products need to be reordered, and now you have the “so what” – an output that is much more actionable.
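
A toy version of that report in Python makes the distinction plain; the product names and reorder points below are hypothetical:

    # The "what" is the on-hand quantity; the "so what" is the REORDER flag.
    inventory = {"widget-A": 42, "widget-B": 7, "widget-C": 19}
    reorder_point = {"widget-A": 25, "widget-B": 20, "widget-C": 10}

    print(f"{'Product':<10} {'On hand':>8}  Action")
    for product, on_hand in inventory.items():
        flag = "REORDER" if on_hand <= reorder_point[product] else ""
        print(f"{product:<10} {on_hand:>8}  {flag}")

One extra column turns a passive inventory listing into a to-do list.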

Some examples of highlighting the “so what’s” in data that have worked well for me and my colleagues include:

  • Superimposing a “target line” on a bar or scatter graph to illustrate which items have met a certain performance objective (a sketch of this appears after the list).
  • Highlighting metrics in a dashboard when their values are outside of a designated control window.
  • Using conditional formats or sparklines in data tables to highlight trends.
  • Simply drawing a box around recommendations or items requiring special attention.
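
As a minimal sketch of the first item, assuming matplotlib is available (the regions, fill rates and target below are made up):

    # Bars that miss the target are colored differently, and a dashed
    # target line makes the performance objective visible at a glance.
    import matplotlib.pyplot as plt

    regions = ["North", "South", "East", "West"]
    fill_rate = [0.96, 0.88, 0.99, 0.91]
    target = 0.95

    colors = ["tab:blue" if v >= target else "tab:red" for v in fill_rate]
    plt.bar(regions, fill_rate, color=colors)
    plt.axhline(target, linestyle="--", color="black",
                label=f"Target ({target:.0%})")
    plt.ylabel("Order fill rate")
    plt.legend()
    plt.show()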

It can take some creativity to come up with the best way to draw a user’s attention to the “so what’s” from an analysis or model, but often this can make the difference between analytics that focus users on value-creating actions and those that simply create information overload.

I hope these recommendations serve as a good reminder of what many consider to be a common-sense approach to deploying analytics. The challenge for many of us who love analytical work is that it can be easy to get lost in the excitement of looking at data or building a model and lose track of the critical importance of enabling decisions and actions from our work. Closing the gap between analytics and action can lead to greater impact for organizations, as well as a more rewarding experience for the practitioner.

Chris Fry is managing director of Strategic Management Solutions, a Redwood Shores, Calif.-based business analytics consulting firm specializing in pricing, forecasting, supply chain strategy, and product complexity management. He was awarded the 2009 Franz Edelman Award for Outstanding Achievement in the Practice of Operations Research and Management Science by INFORMS for his contributions to product portfolio management initiatives at Hewlett-Packard.


  1. A number of recent publications have documented the rise of business analytics. See for example: Davenport, Thomas H. and Jeanne G. Harris, 2007, “Competing on Analytics: The New Science of Winning,” Harvard Business School Press.
  2. See this year’s Franz Edelman Award special issue of Interfaces for examples of world-class, high-impact analytics projects: Interfaces, Vol. 41, No. 1, January-February 2011.
  3. For example screen shots of models designed to give a look under the hood, see T. Olavson and C. Fry, “Spreadsheet Decision Support Tools: Lessons Learned at Hewlett-Packard,” Interfaces, Vol. 38, No. 4, July-August 2008.
