Executive Edge: Closing the gap between analytics and action
By Chris Fry
Analytics is on the minds of executives more than ever before. Data are becoming more accessible, computing power is increasing to the point where we can actually crunch those data, and analytical software tools are becoming increasingly powerful and cost-effective as well.
With such an ideal environment, analytics should be having a massive impact on business performance. Many analytics efforts do have great impact, but too many never seem to reach their full potential. How many times have you heard the story of the analyst who comes up with a great concept or model, but the affected decision-makers have never heard anything about it, cannot understand it or are unable to “operationalize” it?
In my view, such efforts do not reach their full potential because the analysts and project managers haven’t closed the gap between analytics and action. To achieve the greatest impact in our profession, we must work hard to ensure that analytical work leads to action. We must create buy-in for our work, and make each recommendation as actionable as possible.
I have observed several strategies that can help to ensure impact from analytical work. Here are three of them.
1. Start with the process.
Analytics work does not always require process changes, but often it does. When analysts design a model to inform a repeated decision, for example, they must think about how the decision-making process will need to change to take advantage of the analytical solution.
Improving performance through analytics often requires careful integration of the analytics solution into business processes. We have seen countless analytics efforts fail after months of effort spent developing a beautiful model because nobody took the time up front to answer questions such as:
- Who is going to use the model?
- What benefit will the user(s) gain by using the model?
- Do the user(s) have time to use the model?
- Who is going to maintain the model?
- Where are we going to get the data needed to feed the model?
- How will the data get into the model?
- Do the model outputs answer the right questions?
- Can the model outputs be explained easily? Who will explain them?
A more effective approach involves starting with key process changes first and then adding analytical sophistication over time as needed and at the pace it can be absorbed.
As an example, my team recently developed a revenue management solution for a client whose organization had little prior experience with revenue management. We began by working with the client’s management team to define how a revenue management approach could work for their industry and business, and laid out a roadmap for how to put it together.
The initial solution used relatively simple analytics, mainly focused on providing visibility to critical pricing data that had not previously been readily available, with a recommendation algorithm based on a simple heuristic. We spent a significant portion of our time setting up the infrastructure for their organization to use the solution, including:
- Codifying the approach into a reusable model and working with the client’s IT department to automate data feeds into the model.
- Providing training to the client’s users. In our case this involved running more than 10 in-person training sessions for user groups across the U.S. The goal was as much to convince them of the value in changing their process as it was to train them on the specifics. It was time well spent to ensure that the solution was adopted.
- Clearly defining roles, responsibilities, metrics and incentives for users and management involved in the process.
- Clarifying the product definition structure to take better advantage of revenue management.
- Staging the rollout across several test markets so that we could tweak the process over time.
At present, the client has a system up and running with more than 500 users across the U.S. accessing and deriving benefit from it daily. And since the process infrastructure was laid first, we are now able to layer additional analytical sophistication onto that infrastructure. I suspect that our effort would not have been adopted nearly as successfully if we had not put as much energy into process development and deployment as we did.
2. Lift up the hood.
We find we are most successful at moving clients to action when we provide analytics that give them confidence in their decisions. This often means going to great lengths to achieve analytical transparency – making it clear what assumptions are made, what the data show and how the result is derived. Beware the black box. If our goal is to inform action, our analysis should enable us to explain why an action is appropriate, and it should do so in terms that decision-makers and stakeholders can understand.
My favorite way to achieve this type of transparency is through pictures (i.e., charts and graphs). Models can build insight by showing graphically how their outcome would change under alternate decisions or under a range of uncertain scenarios. Charts or graphs can also provide insight into how a model outcome is derived in the first place.
When we build forecasting applications, for example, the forecast is typically only a small portion of the total information set provided as model outputs. We may include a graphical view of input data and assumptions, as well as intermediate outputs showing each step of the calculation for any product, demand segment, etc. When appropriate, we’ll include ranges and show how input uncertainties drive output uncertainties. Our goal is to build models that explain themselves to users – in many ways the opposite of a black box.
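The idea of showing how input uncertainties drive output uncertainties can be sketched in a few lines. The function below runs a simple growth forecast under low, mid and high growth assumptions so a user can see the full range of outcomes, not just a single number; the product, base demand and growth rates are purely illustrative, not from any client engagement.

```python
# Sketch: propagate an uncertain input (growth rate) through a simple
# demand forecast, so the output is a range rather than a single point.
# All names and numbers here are illustrative.

def forecast_range(base_demand, growth_scenarios, periods):
    """Return {scenario_name: [forecast for each period]} for each
    assumed growth rate, making the input-to-output link visible."""
    results = {}
    for name, rate in growth_scenarios.items():
        results[name] = [round(base_demand * (1 + rate) ** t, 1)
                         for t in range(1, periods + 1)]
    return results

# Assumed growth-rate scenarios for a hypothetical product:
scenarios = {"low": 0.01, "mid": 0.03, "high": 0.05}
for name, path in forecast_range(1000, scenarios, periods=3).items():
    print(name, path)
```

Displaying all three paths side by side (or shading the band between "low" and "high" on a chart) is one simple way to let a model explain its own uncertainty to users.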
3. Highlight the “so what’s.”
It’s hard to close the gap between analytics and action if your analytics don’t make it clear what action you want the user to take, or what conclusion you wish your user to draw from the information they’re seeing. Analytical applications or reports that provide information, but do not provide guidance on how to interpret the information, are focused on the “what” but not the “so what.” Take for instance a reporting tool that lists inventory levels on all products in a warehouse at the end of each day. That’s the “what.” Layer into the report an extra line that indicates which products need to be reordered, and now you have the “so what” – an output that is much more actionable.
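The inventory example above can be sketched in a few lines of code. The function below takes a plain "what" report (on-hand levels) and annotates it with the "so what" (a reorder flag); the product names and reorder points are hypothetical, chosen only to illustrate the pattern.

```python
# Sketch: turn a "what" report (inventory levels) into a "so what"
# report (which products need to be reordered). Product names and
# reorder points are hypothetical.

def add_so_what(inventory, reorder_points):
    """Annotate each product's on-hand level with a reorder flag --
    the extra, actionable line layered onto the raw report."""
    report = {}
    for product, on_hand in inventory.items():
        report[product] = {
            "on_hand": on_hand,
            "reorder": on_hand < reorder_points.get(product, 0),
        }
    return report

levels = {"widget-A": 40, "widget-B": 500}   # the "what"
points = {"widget-A": 100, "widget-B": 250}  # assumed reorder thresholds
print(add_so_what(levels, points))
```

The raw levels are unchanged; the single added field is what makes the output actionable.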
Some examples of highlighting the “so what’s” in data that have worked well for me and my colleagues include:
- Superimposing a “target line” on a bar or scatter graph to illustrate which items have met a certain performance objective.
- Highlighting metrics in a dashboard when their values are outside of a designated control window.
- Using conditional formats or sparklines in data tables to highlight trends.
- Simply drawing a box around recommendations or items requiring special attention.
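The second item in the list above (flagging metrics outside a control window) can be sketched as follows. The metric names and limits are illustrative assumptions; the point is only that the comparison logic lives in one place, so the display layer simply highlights whatever comes back flagged.

```python
# Sketch: identify dashboard metrics whose values fall outside a
# designated control window, so the display can highlight them.
# Metric names and control limits are illustrative.

def out_of_control(metrics, windows):
    """Return the names of metrics outside their (low, high) window."""
    flagged = []
    for name, value in metrics.items():
        low, high = windows[name]
        if not (low <= value <= high):
            flagged.append(name)
    return flagged

today = {"fill_rate": 0.91, "on_time": 0.99, "scrap_pct": 0.07}
limits = {"fill_rate": (0.95, 1.0), "on_time": (0.97, 1.0),
          "scrap_pct": (0.0, 0.05)}
print(out_of_control(today, limits))  # -> ['fill_rate', 'scrap_pct']
```

A dashboard can then color, box or otherwise emphasize just the flagged metrics, focusing the user's attention on the "so what's."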
It can take some creativity to come up with the best way to draw a user’s attention to the “so what’s” from an analysis or model, but often this can make the difference between analytics that focus users on value-creating actions and those that simply create information overload.
I hope these recommendations serve as a good reminder of what many consider to be a common-sense approach to deploying analytics. The challenge for many of us who love analytical work is that it can be easy to get lost in the excitement of looking at data or building a model and lose track of the critical importance of enabling decisions and actions from our work. Closing the gap between analytics and action can lead to greater impact for organizations, as well as a more rewarding experience for the practitioner.
Chris Fry (firstname.lastname@example.org) is managing director of Strategic Management Solutions, a Redwood Shores, Calif.-based business analytics consulting firm specializing in pricing, forecasting, supply chain strategy, and product complexity management. He was awarded the 2009 Franz Edelman Award for Outstanding Achievement in the Practice of Operations Research and Management Science by INFORMS for his contributions to product portfolio management initiatives at Hewlett-Packard.
REFERENCES, NOTES & FURTHER READING
- A number of recent publications have documented the rise of business analytics. See for example: Davenport, Thomas H. and Jeanne G. Harris, 2007, “Competing on Analytics: The New Science of Winning,” Harvard Business School Press.
- See this year’s Franz Edelman Award special issue of Interfaces for examples of world-class, high-impact analytics projects: Interfaces, Vol. 41, No. 1, January-February 2011.
- For example screen shots of models designed to give a look under the hood, see T. Olavson and C. Fry, “Spreadsheet Decision Support Tools: Lessons Learned at Hewlett-Packard,” Interfaces, Vol. 38, No. 4, July-August 2008.