Analytics Magazine

Viewpoint: Not taught in school (but useful in the ‘real world’)

September/October 2015

All models are wrong, but some are useful. – George Box

By Scott Nestler, CAP, and Sam Huddleston

Someone recently commented (and asked), “It’s a great time to be a quant grad, but what didn’t they teach you in school that you really need to know?” The first three things that occurred to us are:

  1. The need to fully embrace the second half of the famous quote from George Box [1].
  2. How to communicate the results of your analysis to decision-makers.
  3. The importance of creating a good visualization of your data/model (a subset of the previous task).

With regard to models, one of the most important lessons “in the real world” is understanding the difference between a great-fitting model and a useful model, and the need to structure your research and modeling toward improving the decision-maker’s ability to make decisions rather than simply chasing a high R² value. In academic environments, students are often introduced to a series of methods and then given data sets on which to practice. The goal is almost always to develop the most accurate model possible with the available methods. In the real world, it is often more important to first develop models that are useful, and then to improve their accuracy over time under the constraint that their utility isn’t compromised. Blackjack card-counting systems, discussed below, provide a good example of how model utility rather than raw performance is what matters in practice.
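To make the distinction concrete, consider overfitting, one common way a high R² misleads. The short Python sketch below is our illustration, not an example from the article: the data and models are invented. It fits a simple linear model and a high-degree polynomial to the same noisy linear process; the polynomial posts a near-perfect in-sample R², but the simple model is the one that stays useful out of sample.

```python
# Illustration: a "great-fitting" model vs. a useful one (invented data).
import numpy as np

rng = np.random.default_rng(seed=1)

def r_squared(y, y_hat):
    """Coefficient of determination."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# True process: a simple linear trend plus noise.
x_train = np.linspace(0, 10, 20)
y_train = 2.0 * x_train + rng.normal(0, 3, size=x_train.size)
x_test = np.linspace(0, 10, 200)
y_test = 2.0 * x_test + rng.normal(0, 3, size=x_test.size)

for degree in (1, 12):
    coeffs = np.polyfit(x_train, y_train, degree)  # degree 12 chases the noise
    print(f"degree {degree:2d}: "
          f"train R^2 = {r_squared(y_train, np.polyval(coeffs, x_train)):.3f}, "
          f"test R^2 = {r_squared(y_test, np.polyval(coeffs, x_test)):.3f}")
```

The high-degree fit wins on the training data every time; the linear fit is the one a decision-maker could actually rely on.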

Blackjack Example

Blackjack counting systems are not very “accurate” in the sense that even when the “deck is hot” (the odds have swung in favor of the players versus the dealer), there is still a high probability that the dealer will win any given hand (and the bettor will lose his money to the house). These systems are poor predictors of the outcome of individual hands; the model is often wrong. They are useful, however, because they can be employed systematically to win money over time, as documented in numerous popular books and movies. A counting system (model) provides information for deciding what size bet to make, given the current state of the system. Many different card-counting systems have been developed, but the most useful ones are relatively simple, because more complex systems (which may be more accurate) are almost impossible to execute reliably for betting decisions in a chaotic casino environment. This is just one example of how, in the real world, there is often a trade-off between model accuracy and model utility, with decision-making utility carrying more weight.
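For readers curious what such a system looks like, here is a minimal Python sketch of the widely documented Hi-Lo count paired with a simple bet ramp. The article names no particular system, so Hi-Lo, the six-deck shoe and the bet-sizing rule are illustrative assumptions, not the authors’ method.

```python
# A minimal sketch of the Hi-Lo count. Low cards (2-6) leaving the deck
# favor the player; high cards (10-A) leaving the deck favor the house.
HI_LO_VALUES = {**{r: +1 for r in "23456"},
                **{r: 0 for r in "789"},
                **{r: -1 for r in ["10", "J", "Q", "K", "A"]}}

def true_count(cards_seen, num_decks=6):
    """Running count adjusted for the number of decks left to be dealt."""
    running = sum(HI_LO_VALUES[c] for c in cards_seen)
    decks_remaining = max(num_decks - len(cards_seen) / 52.0, 0.5)
    return running / decks_remaining

def bet_size(cards_seen, unit=10, num_decks=6):
    """Simple bet ramp: bet the minimum unless the true count is favorable."""
    tc = true_count(cards_seen, num_decks)
    return unit * max(1, int(tc))  # e.g., a true count of +3 -> bet 3 units

# A shoe where many low cards have already come out:
seen = ["2", "5", "6", "3", "4", "2", "6", "5", "3", "4", "2", "6"]
print(f"true count = {true_count(seen):+.2f}, bet = ${bet_size(seen)}")
```

Note that the model says nothing about who wins the next hand; its entire value lies in the single decision it supports, namely how much to bet.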

Figure 1: The authors recently saw an attempt to explain the terms precision and accuracy, and the difference between systematic and reproducibility errors. In lieu of three paragraphs of text, they suggest the graphic shown here. Source: http://www.sophia.org/tutorials/accuracy-and-precision
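As a numeric companion to the graphic, the following Python sketch (the measurements and instruments are simulated for illustration) separates the two ideas: accuracy is a small bias of the average reading from the true value (systematic error), while precision is a small spread across repeated readings (reproducibility).

```python
# Accuracy vs. precision on simulated measurements of a known true value.
import numpy as np

rng = np.random.default_rng(seed=7)
true_value = 100.0

instruments = {
    "accurate & precise":    rng.normal(100.0, 0.5, size=1000),
    "accurate, not precise": rng.normal(100.0, 5.0, size=1000),
    "precise, not accurate": rng.normal(105.0, 0.5, size=1000),  # systematic error
}

for name, readings in instruments.items():
    bias = np.mean(readings) - true_value  # accuracy: closeness to the truth
    spread = np.std(readings)              # precision: reproducibility
    print(f"{name:24s} bias = {bias:+5.2f}, spread = {spread:4.2f}")
```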

The point about communication skills was hammered home when the first author was teaching at the Naval Postgraduate School in Monterey, Calif. In the biennial (every two years) program review, the No. 1 comment was that graduates arrived with all of the technical skills they needed, but their ability to communicate with senior leaders and decision-makers, whose time is limited and whose span of responsibility is broad, was lacking.

One common mistake analysts make is failing to distinguish between a technical report or presentation and an executive summary or decision brief. A technical report is a document written to record how an analysis was done (so that it can be replicated) and is designed to make a scientific, logical argument in support of a set of conclusions. Therefore, a common outline for such a report might be: introduction, literature review, problem definition, methodology, results and conclusions. A presentation to a decision-maker in this format is likely to produce impatience and frustration: “Just get to the bottom line.”

Executive Summary & Decision Brief

The structure of a good executive summary or decision brief relies on the logical argument of the technical report (and should be written only once that logic is firmly established) but presents the logic in reverse order. An executive summary should lead with a brief statement of purpose to orient the reader; summarize the conclusions and recommendations (the bottom line up front); present the results of the analysis, preferably in an easy-to-read chart; and briefly highlight the methodology and data used. One way to see the distinction is that the logic of a technical report can be summarized with a series of “Therefore . . .” statements, while the logic of an executive summary relies on a series of “Because . . .” statements.

These ideas appeared in a 2013 blog post [2] by Polly Mitchell-Guthrie, chair of the INFORMS Analytics Certification Board (ACB), which oversees the Certified Analytics Professional (CAP®) program [3]. She writes, “Much as we lament the shortage of graduates from the STEM disciplines (science, technology, engineering and math), it is arguably more difficult to find within that pool graduates who also have the right ‘soft skills.’” Mitchell-Guthrie points out that “selling” yourself and your skills as an analyst, to convince others that you can solve their problems and improve their decision-making, is critical. She suggests Daniel Pink’s book, “To Sell Is Human: The Surprising Truth About Moving Others” [4]. While “hard math” is critical in many instances, convincing someone that you have the technical skills to solve their problem is often harder. This is further highlighted in the seven domains of the CAP Job Task Analysis: business problem framing, analytics problem framing, data, methodology selection, model building, deployment, and lifecycle management. Not surprisingly, many of the supporting 36 tasks and 16 knowledge statements involve communication skills.

These shortcomings among analysts are nothing new. In 2011, an Analytics magazine article [5] by Freeman Marvin, CAP, and Bill Klimack highlighted six “soft” skills every analyst needs: partnering with clients, working with teams, problem framing, interviewing experts, collecting data from groups and communicating results. Failure to communicate results effectively can leave a project a technical success with no practical impact. Instead of dragging the decision-maker through the entire chronology of an analysis, they propose telling a compelling story with a beginning, a middle and an end.

One of the best ways to tell a compelling story is to use pictures (or graphics) to communicate the results of an analysis. Unfortunately, methods and principles for visually communicating the results of an analysis are often not taught in technical programs even though, as Mike Driscoll asserts in a popular online presentation [6], the ability to “munge, model and visually communicate data” are “the three core skills of data geeks.” Reviewing the work of Edward Tufte [7] and William S. Cleveland [8] provides an excellent foundation for visually communicating quantitative information. “Choosing a Good Chart” [9] by Abela is also useful, as it suggests an appropriate type of graphic for nearly any type of data and purpose.
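As a small illustration of those principles, the matplotlib sketch below (the series and numbers are invented) applies two of them: labeling each series directly rather than relying on a legend, and stripping non-data ink from the frame.

```python
# A minimal chart following two common visualization principles:
# direct labeling of series and removal of non-data ink.
import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014]
series = {"Product A": [3.1, 3.6, 4.2, 4.9, 5.8],
          "Product B": [2.8, 2.9, 3.0, 3.2, 3.3]}

fig, ax = plt.subplots(figsize=(6, 3.5))
for name, values in series.items():
    ax.plot(years, values, linewidth=2)
    ax.annotate(name, (years[-1], values[-1]),   # label the line directly
                xytext=(5, 0), textcoords="offset points", va="center")

ax.set_title("Revenue by product ($M)")          # units stated up front
for side in ("top", "right"):                    # strip non-data ink
    ax.spines[side].set_visible(False)
fig.tight_layout()
plt.show()
```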

Summing Up

In summary, first focus on developing useful models. Second, when communicating with decision-makers, start by describing the utility of those models: how they can be used and what difference they will make. Only after communicating the practical effects of employing the model or analysis should you explain how you arrived at your conclusions (following the logic of the technical report backwards). Finally, the most compelling way to communicate these ideas is through graphical products that clearly convey the key results of your analysis. As they say, “A picture is worth a thousand words.”


Scott Nestler (snestler@nd.edu), Ph.D., CAP, is an associate professional specialist in the Department of Management, Mendoza College of Business, University of Notre Dame, and a longtime member of INFORMS. Sam Huddleston (shh4m@virginia.edu), Ph.D., is an operations research analyst in the U.S. Army.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Army, the Department of Defense or the U.S. government.

REFERENCES

  1. https://en.wikiquote.org/wiki/George_E._P._Box
  2. http://blogs.sas.com/content/subconsciousmusings/2013/02/01/why-soft-skills-are-so-important-in-analytics-and-how-to-learn-them/
  3. www.certifiedanalytics.org
  4. http://www.danpink.com/books/to-sell-is-human/
  5. http://www.analytics-magazine.org/january-february-2011/76-special-conference-section-people-to-people.html
  6. http://igniteshow.com/videos/mike-driscoll-three-sexy-skills-data-geeks
  7. http://www.edwardtufte.com/tufte/
  8. http://www.stat.purdue.edu/~wsc/
  9. http://extremepresentation.typepad.com/blog/2015/01/announcing-the-slide-chooser.html

