
Analytics Magazine

Software Survey: Decision Analysis

January/February 2013

Survey highlights new features and trends, but the relationship between technology and thoughtful analysis remains essential to the success of any software tool.

By William M. Patchak


Two years have passed since OR/MS Today, the membership magazine of INFORMS (Institute for Operations Research and the Management Sciences) last surveyed the landscape of decision analysis software, a landscape continuing to evolve to meet the needs of decision analysts and the clients/decision makers they serve. When Don Buckshaw introduced the vendors featured in 2010’s survey, he observed that they were “producing a richer set of analysis tools and better visualization and analysis options” than when the first survey debuted in 1993 [1]. Dennis Buede, in his article introducing that inaugural survey, defined decision analysis as “the discipline of evaluating complex alternatives in the light of uncertainty, value preferences and risk preferences” [2]. Almost 20 years later, decision analysts have access to software products supporting a range of purposes, from the softer skills of structured value elicitation to high horsepower predictive analytics. Indeed, the results of this year’s survey confirm the growing diversity of tools available, each with increasing numbers of features to assist the decision analyst in an array of industries and problem sets.

The Survey

This year’s survey process closely followed those of previous years, with an online questionnaire sent to vendors based on either past survey participation or the authors’ awareness of existing software tools. As with previous surveys, vendors who did not receive the original questionnaire can add their information to the online survey results by contacting Patton McGinley at patton@lionhrtpub.com, and results are provided verbatim from the responses sent by the vendors. The compiled results are not meant to rank the software tools by quality or cost effectiveness – they are merely meant to catalogue existing products and to make potential users aware of new features and releases.

The responses this year featured 13 returning vendors from 2010, with 10 additional vendors submitting information as well. Two of the “newer” vendors are actually affiliated with products featured in 2010 – an indicator that some software ownership may have changed in the past two years. While the number of vendors responding decreased slightly from 2010 (from 24 to 23), individual software entries increased. Compared to the 36 products listed in the 2010 survey results, 47 products appear this year, with many vendors submitting multiple entries. Such an increase in product diversity from a slightly smaller pool of vendors may indicate that certain vendors are branching out from previous releases to offer a broader range of tools.

2012 Results

The product price range has expanded since 2010, with some versions of the software tools offered for free and certain commercial licenses reaching $49,000. Likewise, this year’s results feature additional international vendors from the United Kingdom and Belgium to complement returning vendors from the United States, New Zealand, Finland, Sweden and the U.K.

In terms of applications for the products, this year’s survey maintained the list from 2010: selection of a best option using multiple competing objectives, analysis of uncertainty, analysis of probabilistic dependencies, risk preference, sequential decision-making, portfolio decision-making and multiple stakeholders’ collaboration. The responses this year also featured several software tools specifically citing their predictive analytic capabilities, such as ensemble decision trees and multivariate adaptive regression splines. And while Bayesian belief networks are one way to represent and analyze uncertainty, this year’s survey included a specific option for vendors to indicate whether their products support Bayesian networks. The result of this change was that, while almost all of the vendors said their software products could analyze uncertainty, only six responses explicitly listed a Bayesian network capability. Overall, and as expected, the featured vendors run the gamut in terms of application and functionality.
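As a generic illustration of the kind of reasoning a Bayesian network capability automates – not tied to any surveyed product, and with made-up probabilities – a minimal two-node network can be queried by direct enumeration:

```python
# Minimal Bayesian-network inference sketch: Rain -> WetGrass.
# All probabilities are invented values for illustration only.

P_rain = 0.2                           # P(Rain = true)
P_wet_given = {True: 0.9, False: 0.1}  # P(WetGrass = true | Rain)

def posterior_rain_given_wet():
    """P(Rain = true | WetGrass = true) via Bayes' rule."""
    joint_true = P_rain * P_wet_given[True]          # P(rain, wet)
    joint_false = (1 - P_rain) * P_wet_given[False]  # P(no rain, wet)
    return joint_true / (joint_true + joint_false)

print(round(posterior_rain_given_wet(), 3))  # → 0.692
```

Commercial packages wrap exactly this kind of conditioning in graphical model-building and elicitation interfaces; the arithmetic underneath is the same.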

Another addition to this year’s survey was a catch-all free-response question to capture additional features and methods not explicitly addressed elsewhere in the survey. The diversity of featured software packages necessitated the question, and responses captured nuances in software capabilities ranging from the simulation of time-series processes to group surveys to methodologies for handling missing data.

Another change for the 2012 survey was the inclusion of questions about the training available for each product. Following the 2010 survey, Buckshaw challenged vendors to “build in some form of coaching into their products so that even a novice can be confident that their models are producing sensible results” [1]. While recognizably different from built-in coaching, training opportunities provided either by the vendors themselves or by third parties are one way of steering new users toward effective implementation of the software tools. As anticipated, the vast majority of this year’s survey respondents provide classroom training for their tools themselves. However, some training is also available via third-party courses, and some is available online.

Beyond 2012

As it has in the past, this year’s survey serves as a helpful benchmark and snapshot in time for the decision analysis software community. Looking out beyond the current year, it’s easy to see certain trends continuing into the future. For example, Buckshaw observed in 2010 that “10 packages [were] web implementations, up from six packages two years ago [2008]. This trend will likely continue” [1]. Indeed, the 2012 survey lists 12 web implementations available, seven of which are returning packages from 2010. There’s very little reason to expect this trend to reverse.

According to Gartner’s Hype Cycle for Business Intelligence, 2011, collaborative decision-making (CDM) is just entering its “rise” in the business intelligence community, and many of the software tools highlighted in this year’s survey appear positioned to ride the CDM wave. As Gartner terms it, the method and its applicable software packages have had their “technology trigger” and are now well on their way toward the “peak of inflated expectations” by focusing on bringing high-level decision-makers together in a transparent way to facilitate a group decision and to capture the process and best practices [3]. In 2010, only 12 (33 percent) of the featured software products supported group elicitation and nine (25 percent) supported decentralized elicitation. In this year’s survey, 21 (45 percent) of the featured products have group elicitation capabilities, with 17 (36 percent) supporting decentralized elicitation. Twenty-eight (60 percent) have multiple stakeholder collaboration applications. For those software vendors poised to support CDM, the increasing trend of Web integration could be extra beneficial in connecting disparate decision-makers.

However, while the technology itself continues to evolve, the need for thoughtful analysis remains essential to the success of any software tool. Increasing computational power and the expanding capabilities of “data warehouses” are now tempting more and more senior decision-makers to have their organizations enter the world of analytics. Notably, several new software tools for predictive analytics are featured in this year’s survey results. Yet, many analysts estimate that data cleanup accounts for 80 percent of the cost of data warehousing projects [4], and in my experience, these cleanup efforts are heavily reliant on human judgment.

Choices made on how to deal with data inconsistencies and data errors can make or break a decision model. In other words, no matter how automated an analytic software product may be, an analyst must make the right decisions on how to prepare the data. Even once the data is cleaned, the choice of algorithms and modeling methods remains as much an art as a science, and as much an act of refinement and revision as an act of fundamental principle.
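As a deliberately tiny, hypothetical sketch of this point – all values invented – even the choice between dropping a missing record and zero-filling it changes a simple summary statistic that downstream models would consume:

```python
# Two common cleanup choices for a missing value in a numeric series
# (hypothetical data; None marks the missing record).
raw = [10.0, 12.0, None, 14.0]

def mean_after_drop(values):
    """Drop records with missing values, then average."""
    kept = [v for v in values if v is not None]
    return sum(kept) / len(kept)

def mean_after_zero_fill(values):
    """Treat missing values as zero, then average."""
    filled = [0.0 if v is None else v for v in values]
    return sum(filled) / len(filled)

print(mean_after_drop(raw))       # → 12.0
print(mean_after_zero_fill(raw))  # → 9.0
```

The two defensible-sounding choices differ by 25 percent on four data points; at warehouse scale, such choices quietly shape every model built on top of the data.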

The symbiotic relationship between decision analyst and software tools is tested further in certain government or defense industries, where information security can trump the ability to leverage sophisticated modeling software. An analyst working in these industries must transfer his or her knowledge of the software’s underlying principles to whatever tool is available – pen and paper if necessary – in order to complete the required tasks. On these occasions, software products have the opportunity to be more than “black boxes” outputting an answer. Rather, they can inspire analysts to approach decision analysis problems in new ways, perhaps with different techniques for visualization and elicitation. Instead of replacing a living, breathing analyst, well-designed software has the ability to impact business practices beyond the immediate application of a product.

Final Thoughts

Where will the decision analysis software community be in 2014 when the next survey is issued? If I were a financial analyst, I would start with the disclaimer that past performance is not an indicator of future results. But despite several entries from newcomers to the survey, the 2012 results saw many returning vendors, albeit with updated features and new tools. Many vendors continue to build on the fundamental underlying decision analysis principles and previous software releases to refine the user experience for decision analysts. For the next survey, I would expect to see a similar number of respondents, but perhaps with more Web implementations and more integration opportunities.

Buckshaw acknowledged in 2010 that the software “is a tool to support smart analysis, not to replace it” [1], and as the software increases in sophistication every year, challenges remain for the decision analysts using it. Data cleanup and innovative applications of the products and underlying principles continue to require careful thought and analysis. Technology has yet to replace the decision analyst, and a decision analyst cannot function to full effect without an ever-improving suite of software tools. There’s no indication this symbiotic relationship will change any time soon, but I look forward to seeing how it continues to evolve in the years to come.


William M. Patchak (wpatchak@innovativedecisions.com) is a decision analyst with Innovative Decisions, Inc., a management consulting firm serving business and government clients and specializing in the disciplines of decision and risk analysis, operations research and systems engineering. A version of this article appeared in the October 2012 issue of OR/MS Today.

REFERENCES

  1. Buckshaw, Don, “Decision Analysis Software Survey,” OR/MS Today, October 2010.
  2. Buede, Dennis, “Decision Analysis Software: Aiding the Development of Insight,” OR/MS Today, April 1993.
  3. Sallam, Rita L. and Andreas Bitterer, “Hype Cycle for Business Intelligence, 2011,” Gartner, Inc., August 2011.
  4. Dasu, Tamraparni and Theodore Johnson, “Exploratory Data Mining and Data Cleaning,” John Wiley & Sons, Inc., 2003.

Survey Results & Directory

For results of the decision analysis software survey and a directory of decision analysis software vendors, see the accompanying online survey directory.
