

Analytics Magazine

Decision Analysis Software Survey

Winter 2008


2008 survey of D.A. software targets a growing number of users

By Daniel T. Maxwell

Over the last two decades, the computational power and visualization capabilities supporting decision analysis software have increased by many orders of magnitude. These advances have made decision analytic (and many other operations research) techniques accessible to a much broader community of users. In fact, distribution numbers for many of the more mature software packages have grown from hundreds of likely specialized professional users a decade ago to many thousands of users worldwide today. This proliferation of software might lead one to believe that the professional decision analyst, or even the operations research analyst, will go the way of the blacksmith. Of course, this isn’t happening. Hard decisions don’t become easy because software and computers are more powerful.

Hard decisions usually possess one or more of the following characteristics [1]: complexity, uncertainty, multiple competing objectives and multiple stakeholders. Because of this, hard decisions will keep us in business! Hard decisions are all around us. A family’s decision to buy a used car has multiple objectives, uncertainty and likely multiple stakeholders. A high school senior’s college decision has competing objectives with uncertain outcomes. The list could go on for pages. While these types of decisions benefit from applying structured techniques, they are not the decision situations that cause decision-makers and senior managers to mobilize the operations research community. Operations research and decision analysis professionals get involved when the stakes get extremely large. We get questions like, “What should an organization’s multi-billion dollar research portfolio contain?” or “Where should a new nuclear power plant be located?” These are messy decision situations with significant long-term consequences, and they are the analysis challenges the operations research community is mobilized to help understand. In facing them, the community benefits greatly from having decision analytic techniques and software tools in its “bag of tricks.”

The 2008 Survey

The 25 software packages identified in this survey represent a wonderful set of tools currently available to assist professionals in building decision analytic models and implementing decision analytic techniques. This collection of packages provides a potpourri of capabilities designed to support all sorts of activities associated with the decision analytic process [2]. The data contained in the survey is vendor-provided and has not been further validated by either the author or the publisher. There are software packages matched to virtually all current operating systems, as well as Web applications. The purchase cost of these packages varies widely, from free for academic use to many thousands of dollars for professional versions. Additionally, practitioners should know that some packages (not identified in the survey) are available for free through universities and other Web sources. All of that said, the cost of the software is usually only a small fraction of the cost of a major study or large decision analytic modeling effort. And the perceived success or failure of a software package in an organization or study almost always correlates more closely with how well it is matched to the skill of the analyst, the decision situation and the analysis environment than it does with cost. So purchase price is rarely the best metric for a selection decision. Due diligence in selecting a package should include thinking about the following critical questions in addition to cost:

• Is there a single decision-maker or multiple decision-makers (stakeholders)?
• Will the decision-maker(s) participate in the decision analysis (decision conference), or will they be periodically presented results (dialog decision process)?
• Is the decision situation a choice of one alternative or a portfolio of alternatives?
• Do stakeholders have multiple, conflicting objectives that must be considered?
• Is there significant uncertainty in the events that impact the decision outcomes?
• Is it a single decision or a decision strategy (a sequence of decisions over time)?

Let’s look at each of these questions in a little more detail.

The number of decision-makers involved in the process influences how analysts can facilitate the communication that is necessary for structuring a decision model. If it is an individual decision-maker, then notebook computer-based tools that are designed for one-on-one communication or individual use are ideal. As the group size grows, software tools that support collective brainstorming and communication become increasingly important for problem structuring. Some packages, like “1,000 Minds” and “Opinions-Online,” are Web-enabled and support communication among members of geographically dispersed groups. Once the problem is structured, some tools allow for distributed collection of input on value functions and weights. The ideal situation is to have a couple of different (but complementary) elicitation approaches available that allow the analytic team to check the input they are receiving for logical consistency.

How the decision-maker(s) will participate in the analysis is also critical to tool selection. If the modeling and analysis are to be accomplished in a conference setting, then it is critical that the tools be visually intuitive and easily (quickly) modified in response to discussion. For example, Figure 1a is an example of a probability wheel that can be used to support elicitation of individual probabilities and Figure 1b is a view of an interface that allows the analyst to simultaneously see a broad set of conditional probabilities. Features like this can be found in a number of packages including DPL.

Figure 1a: Probability wheel elicitation screen.

Figure 1b: Full conditional probability elicitation screen.

If the decision-maker is only episodically involved in the analysis process (often the case in large, long-term efforts), then the team may benefit from tools that provide more modeling flexibility. For example, NETICA has a feature that supports the construction of conditional probability distributions from equations, allowing an analyst to efficiently populate large conditional probability tables. This can be very helpful in models that have hundreds of thousands of conditional probabilities.
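The idea of filling a conditional probability table from an equation can be sketched generically in Python. This is an illustration of the concept only, not NETICA’s actual equation syntax; the parent variables (“workload,” “staffing”) and the logistic rule are hypothetical.

```python
import itertools
import math

def cpt_from_equation(parent_states, equation):
    """Fill a conditional probability table for a binary child node by
    evaluating an equation at every combination of parent states."""
    cpt = {}
    for combo in itertools.product(*parent_states.values()):
        p_true = equation(dict(zip(parent_states, combo)))
        cpt[combo] = (p_true, 1.0 - p_true)  # P(child = T), P(child = F)
    return cpt

# Hypothetical model: chance of on-time delivery given workload and staffing.
parents = {"workload": [1, 2, 3], "staffing": [1, 2, 3]}
logistic = lambda x: 1.0 / (1.0 + math.exp(-(x["staffing"] - x["workload"])))
table = cpt_from_equation(parents, logistic)
# Nine parent combinations, each row of probabilities summing to 1.0
```

For a node with, say, ten three-state parents, the same loop fills 3^10 = 59,049 rows from one line of model logic, which is the efficiency the equation feature provides.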

Role in Model Development

Another consideration that relates to decision-maker participation is the role the decision analytic software will play in the model development process. For example, Logical Decisions (LDW) can be used to interactively develop utility curves and weights, guiding the analyst and group through the process. Figure 2 shows an example of an interactive elicitation screen for a single attribute utility function available in LDW. Conversely, HIVIEW takes a different approach and acts as a “bookkeeper” for data elicited in a decision conference. This approach provides the facilitator maximum flexibility for interacting with the group to develop a model and consensus on a course of action. The tool plays a supporting role by helping to ensure that all of the necessary model data is collected without inhibiting the facilitator’s ability to work in their own or the group’s comfort zone.

Figure 2: Single attribute utility function elicitation screen.

Identifying whether the decision situation is a choice among alternatives or a portfolio containing multiple alternatives is obviously also a critical factor in tool selection. Packages that support portfolios use varying solution approaches. For example, Equity3 implements a greedy cost-benefit algorithm that provides the user an “order of buy,” an easy-to-explain, high-value solution that is not necessarily optimal. This can be a very useful approach, particularly in an interactive group setting. Other packages, like Logical Decisions Portfolio, apply mixed integer programming as the solution algorithm. Some analysts may already have access to and be familiar with optimization software; they may prefer to use those tools to accomplish the optimization and limit the role of the decision analysis tools to the development and elicitation of the objective function [3]. There have been some very intuitive results displays created for portfolio tools in recent years. Figure 3 is an example of an intuitive style of portfolio display that is available in both Logical Decisions and Equity3.
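The greedy cost-benefit idea behind an “order of buy” fits in a few lines of Python. This is a generic sketch of the technique, not Equity3’s implementation; the project names, benefits and costs are invented.

```python
def order_of_buy(projects, budget):
    """Greedy cost-benefit ranking: fund projects in descending
    benefit-to-cost ratio until the budget is exhausted."""
    ranked = sorted(projects, key=lambda p: p["benefit"] / p["cost"], reverse=True)
    funded, remaining = [], budget
    for p in ranked:
        if p["cost"] <= remaining:
            funded.append(p["name"])
            remaining -= p["cost"]
    return funded

# Hypothetical portfolio: benefit is in value units, cost in $ thousands.
portfolio = [
    {"name": "A", "benefit": 90, "cost": 30},
    {"name": "B", "benefit": 60, "cost": 10},
    {"name": "C", "benefit": 40, "cost": 40},
    {"name": "D", "benefit": 20, "cost": 25},
]
print(order_of_buy(portfolio, budget=65))  # → ['B', 'A', 'D']
```

Ranking by benefit-to-cost ratio maximizes value per dollar at each step, which is why the result is easy to explain in a group setting even when a mixed integer program could sometimes squeeze out more total benefit from the same budget.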

Figure 3: Sample portfolio display.

Most hard decision situations require decision-maker(s) to make trades among a complicated set of competing objectives. There are a number of multi-criteria decision techniques implemented in the available software. Multi-Attribute Utility Theory (MAUT) and the Analytic Hierarchy Process (AHP) are the most prevalent and are identified explicitly in the survey. Most of the packages indicate that they implement MAUT. One package, DEA Solver, indicates it implements AHP, and Logical Decisions indicates that both MAUT and AHP are supported. In addition to these approaches, ordinal ranking techniques like SMARTER [4] are available in some of the software packages and can be quickly implemented to develop a first-order set of weights for a decision model. The quick technique just might be good enough to meet the analysis goals.
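SMARTER’s rank-order centroid weights, as published by Edwards and Barron [4], are simple enough to compute by hand; a small Python sketch of the formula:

```python
def roc_weights(n):
    """Rank-order centroid weights used by SMARTER: the k-th ranked
    attribute (k = 1 is most important) gets
    w_k = (1/n) * sum(1/i for i = k..n)."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

# Four attributes ranked by importance yield weights of roughly
# 0.52, 0.27, 0.15 and 0.06, summing to 1.
weights = roc_weights(4)
```

Only the rank order of the attributes is elicited from the stakeholder; the weights fall out of the formula, which is what makes the technique so quick to apply.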

Modeling Philosophy

More important than the software itself, analysts must understand that these techniques rest on different underlying axioms and different philosophies about how decision models should be formulated. Both approaches have strengths, weaknesses and limitations that deserve some research before they are applied in practice. Whichever technique is applied, it is important that analysts ensure that both the relative importance of attributes and the range over which each attribute varies are clearly presented to the stakeholder for consideration as an integral part of the elicitation process. Considering only importance increases the risk that the model will produce unreliable results.
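A tiny additive-value example illustrates the point. The cars, scores and weights below are invented for illustration: when the price range across the alternatives is narrow, a range-aware (swing) weight on price is small, and assigning weight by abstract “importance” instead reverses the ranking.

```python
def additive_value(weights, scores):
    """Simple additive value model: v(x) = sum_i w_i * v_i(x_i), where each
    single-attribute value v_i is scaled 0-1 over that attribute's range."""
    return sum(w * s for w, s in zip(weights, scores))

# Two hypothetical cars scored on [price, safety]; 1 = best in range.
cheap_car = [1.0, 0.0]
safe_car = [0.0, 1.0]

# Swing weights when prices differ by only a few hundred dollars:
# the price swing matters little, so price gets a small weight.
w_range_aware = [0.15, 0.85]

# Naive weights from "price is very important," ignoring the narrow range.
w_naive = [0.70, 0.30]

# Range-aware weights favor the safe car; naive weights flip the ranking.
```

The same scores produce opposite recommendations under the two weight sets, which is exactly the unreliability the elicitation process must guard against.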

Uncertainty is also almost always a factor when making hard decisions. How it is addressed varies among the packages. How it is best addressed depends on the nature of the uncertainty, how the model is being developed, the data that is available and the resources that are available for model development. We saw earlier (Figure 1) that tools are available for eliciting probability judgments from experts. Often, these judgments are placed in a decision analytic model called an influence diagram. These diagrams are designed to combine an intuitive, visual presentation of the relationships among the variables with a sound underlying mathematical representation of their joint probability distribution. NETICA, DPL and Analytica are all packages that implement influence diagrams. These packages are extremely valuable for representing very complicated combinations of dependencies among variables, especially when interacting directly with decision-makers.

In the early days of influence diagrams and other decision analytic models that considered uncertainty explicitly, solution time and computer memory for models were very important considerations. As an example, one influence diagram model the author developed in the very early 1990s possessing almost 2 million solution paths took approximately two hours to solve. Today, the same model using newer versions of the influence diagram software and a current notebook computer solves in less than three minutes. This power allows us to represent and solve increasingly complicated problems. It also allows analysts to exercise the models we develop more rigorously.

In some cases the analytic team might have large quantities of data that can be used to inform the probability model. Some of these packages (as well as some statistical packages) have learning algorithms that will build the joint probability distribution from the available data. If this option is available, the analysis team should be certain to supplement the automated effort and involve subject matter experts in the review of the resulting model. The experts can help find errors in the data and, just as importantly, they can supplement the model with knowledge they possess. Combining what is learned from data with what is learned from experts usually yields a better model and a higher likelihood that the effort will be successful.

A final consideration for software selection is identifying whether the model considers a single decision or a decision strategy. Virtually all of the packages will consider a single decision. The influence diagram packages and some of the Monte Carlo packages will also support multiple decisions that might unfold over time. In influence diagrams, this situation is represented as a sequence of decisions, likely with uncertainties that resolve over time spaced between the decisions. Sometimes a hard decision actually consists of a set of smaller decisions that either occur over time or can be thought of as a package. A technique for representing this type of situation, supported by virtually all of the software packages, is the decision strategy or alternative generation table [5].
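A strategy (alternative generation) table can be enumerated mechanically: each column is a decision area, and a candidate strategy picks one entry per column. The decision areas and options below are invented for illustration.

```python
import itertools

# A small strategy table for a hypothetical facility decision.
strategy_table = {
    "site": ["coastal", "inland"],
    "financing": ["public", "private", "hybrid"],
    "schedule": ["fast-track", "phased"],
}

# Enumerate every combination of one choice per decision area; in practice,
# screening rules would then prune the infeasible combinations.
strategies = [dict(zip(strategy_table, combo))
              for combo in itertools.product(*strategy_table.values())]
print(len(strategies))  # → 12 candidate strategies (2 * 3 * 2)
```

Even this toy table generates a dozen candidate strategies, which is why the table format is so useful for keeping a group’s alternative generation organized and exhaustive.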


The information provided in this article and the survey is intended to help analysts select a toolset that fits the specific challenge they face or maybe even a general-purpose package for their toolkit. When shopping for decision analysis software, focus on the potential tool’s ability to fit the specific problem or class of problems you face. Evaluate the software in relation to the situational factors that are relevant to you. If your goal is to add a package or two to your general tool kit, then a package or combination of packages that provides balanced support across the spectrum of situations and the entire decision analysis process is likely your best investment. If the types of models you wish to employ involve multiple stakeholders and multiple competing attributes, then tools that emphasize group support and value elicitation are worth exploring. Problems involving large uncertainties, diagnosis, complex interdependencies or risk analysis would benefit most from tools like influence diagrams, Bayesian networks or one of the Monte Carlo modeling tools. Whichever tool(s) you select, they should be intuitive to the user, explainable to the client and support easy iteration among the various stages of the decision analysis process.

Dan Maxwell is a senior principal analyst at Innovative Decisions, Inc. of Centreville, Va.


1. See “Making Hard Decisions: An Introduction to Decision Analysis,” by Robert Clemen, or “Value-Focused Thinking: A Path to Creative Decisionmaking,” by Ralph Keeney, for a more complete discussion of decision analysis principles and practical techniques.

2. For a description of the decision analysis process see “Improving Hard Decisions,” OR/MS Today, 2006, Vol. 33. No 6.

3. See “Strategic Decision Making,” Craig W. Kirkwood (1997), for a discussion of considerations in resource allocation.

4. See Edwards, W., and Barron, F.H., “SMARTS and SMARTER: Improved Simple Methods for Multi-attribute Utility Measurement,” Organizational Behavior and Human Decision Processes, Vol. 60, No. 3, December 1994, pp. 306-325.

5. See Parnell, G.S., “Value-Focused Thinking,” Chapter 19 in “Methods for Conducting Military Operational Analysis,” Military Operations Research Society, editors Larry Rainey and Andrew Loerch, 2007, pp. 619-656, for a concise discussion of alternative generation tables.

To view the directory of DA products along with the survey data, see

