

Analytics Magazine

Business Intelligence & Modeling Systems Synergy

Fall 2009


Integrating often-competing methodologies helps analyze and solve particularly tough problems.

By Jeremy F. Shapiro

The information revolution has reached a stage where business managers are seeking to extract intelligence from their large and growing transactional databases. After a 40-year nascent period, the revolution took off in the 1980s with the introduction of PCs and supporting software that put computing in the hands of much wider and diversified populations of users. The acceleration continued in the 1990s with significant advances in Internet technologies and the introduction of enterprise resource planning (ERP) systems and other transactional data management systems. Investment in the United States in computers, peripherals and software, measured in 1996 dollars, grew from $20 billion in 1983 to $430 billion in 2000 [1]. Except for mild slowdowns due to the bursting of the dot-com bubble and the recent economic downturn, developments and investments in Internet technologies and ERP systems have continued to grow rapidly during the 2000s.

In 1999, Drucker [2] correctly pointed out that the information revolution up to that time had mainly produced software systems that “routinize” traditional business processes with tremendous savings in time and possibly cost. By contrast, the systems had little impact on the ways in which business decisions were made. More recently, managers have come to understand this limitation. As a result, the acquisition and application of business intelligence (BI) systems have grown significantly.

BI systems create data warehouses from which managers can extract key performance indicators (KPIs) and other metrics that track the performance of their companies. They have been successfully applied to business problems in sourcing & procurement, logistics, inventory management, quality & production, services performance, marketing & sales, asset management, cash management and other areas. To meet the growing interest, large IT companies such as SAP, IBM and Oracle have, in the past four years, acquired leading BI systems and invested heavily in expanding their capabilities.

Despite the recent applications, the ways in which BI systems can or should employ analytics in extracting intelligence from data have not yet been clearly stated and understood. For example, Erickson [3] defines BI as “… tools, technologies, and processes required to turn data into information and information into knowledge and plans that optimize business actions … It encompasses data integration, data warehousing, and reporting and analysis tools.” Natural questions are: In what sense (and how) are business actions optimized? What are the tools used to identify these actions? What role does a business decision-maker play in using these tools to identify his/her preferred or optimal actions?

Similarly, Davenport and Harris [4] state, “By analytics we mean the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions. … Analytics are a subset of what has come to be called business intelligence; a set of technologies and processes that use data to understand and analyze business performance.” Again, natural questions are: Does the collection of analytics have inherent structure that allows their systematic application to business decision problems or is their application more randomly positioned for specific problems? Do existing BI systems already employ analytics to fully evaluate decisions and actions or are there important, untapped opportunities for doing so?

Our goal here is to answer these and related questions. In so doing, we will discuss the differences and similarities between BI systems and modeling systems, which employ a range of descriptive and optimization models in analyzing focused decisions. We will also discuss ways to merge the two technologies, thereby providing enormous openings for deeper analysis by BI systems of business planning problems and wider application of modeling systems to such problems.

Of course, people and processes for exploiting BI systems and modeling systems are necessary for their successful application. Drucker [2] suggests that a critical requirement is the creation of new business entities comprised of knowledge professionals who assist managers with overlapping responsibilities in making intelligent, integrated decisions. These professionals must have advanced training in IT and analytics and an in-depth understanding of the company’s business. If they are employees of the company, they must be given sufficient organizational status and power to promote and sometimes even impose collaboration among senior managers in their decision-making.

BI System Schema

An effective BI system supports the needs of a variety of users — executives, managers, analysts, power users and casual business users — in an integrated and comprehensive manner [5]. Executives require information that is highly summarized and directly relevant to their key initiatives. They seek customized dashboards that provide an at-a-glance view of the performance of their companies. Managers require both a high-level view of information relevant to their jobs and the ability to drill down into details. Some managers and the analysts who work with them need to go beyond basic reporting to more complex analysis explaining trends and correlations in their data. Power users, especially financial analysts responsible for creating and distributing statement-style reports to the rest of the organization, need flexible, professional authoring tools. Finally, casual users seek access to easy-to-use information in a format that fits with their normal activities. They also need to leverage simple system capabilities in searching for relevant information.

The schema of a typical BI system is depicted in Figure 1 [6]. As shown, the BI system performs an integration and mapping of diverse data sources presented in a variety of formats. Specifically, the system employs processes that extract, transform and load (ETL) these data for storage in homogeneous formats in the data warehouse. It also checks data quality, concentrating on those data elements that will be most frequently incorporated in reports, dashboards, tables and other outputs. A data directory for the warehouse is constructed along with one or more dictionaries of the data to be accessed by specific reporting or analysis tools. The data warehouse should retrieve and store unaggregated data drawn from the data sources; aggregations may be performed for reporting purposes but the warehouse needs to retain data in its primitive form to avoid ambiguities and inaccuracies.
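The extract-transform-load flow described above can be sketched in a few lines. This is a deliberately minimal illustration, not a production ETL routine; the source names, field names and records are invented for the example.

```python
# Toy ETL sketch: heterogeneous source records are transformed into one
# homogeneous warehouse schema, with a simple quality check on load.
# Source systems, fields and values here are hypothetical.

raw_sources = {
    "erp": [{"sku": "A-100", "qty": "5", "unit_price": "12.50"}],
    "web": [{"product": "A-100", "units": 3, "price_cents": 1250}],
}

def transform(source, record):
    """Map each source's native format onto the warehouse schema."""
    if source == "erp":
        return {"sku": record["sku"], "qty": int(record["qty"]),
                "unit_price": float(record["unit_price"])}
    if source == "web":
        return {"sku": record["product"], "qty": record["units"],
                "unit_price": record["price_cents"] / 100.0}

def quality_ok(row):
    # Concentrate checks on fields most frequently used in reports.
    return row["qty"] > 0 and row["unit_price"] > 0

warehouse = []
for source, records in raw_sources.items():
    for rec in records:
        row = transform(source, rec)
        if quality_ok(row):          # load only rows passing quality checks
            warehouse.append(row)
```

Note that the warehouse rows retain the unaggregated, per-transaction grain; any roll-ups for reporting would be computed downstream.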

Figure 1: Business intelligence system schema.

Given the quantity and variety of data that is likely to be stored in the data warehouse, its design and implementation is a challenging task. Moreover, the scope of BI support has been expanding to include vendors and customers of the company who need to access data relevant to their responsibilities and interests. Online analytical processing (OLAP), database partitioning, multi-dimensional clustering, and other tools for managing, extracting, combining and projecting data need to be very efficient. Although there is no universally accepted “best approach” for designing a data warehouse, current practice is often some combination of the dimensional approach and the normalized approach, which we will now explain [7].

With the dimensional approach, transaction data are divided into “facts” (which are usually numerical data) and “dimensions” (which provide context for the facts). As an example, the facts of a purchase transaction are the number of products ordered and the prices paid, while the dimensions are the purchase and delivery dates, vendor name, the person who placed the order, product number, and the ship-to and pay-to locations. The dimensional approach is easy for users to understand and it promotes rapid data retrieval. Its disadvantage is that it is complicated to implement when several operating systems are being integrated. In addition, the approach is difficult to modify if and when the business changes the ways in which it operates.
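A miniature star schema makes the fact/dimension split concrete. The tables and figures below are invented for illustration; a real warehouse would hold these in database tables rather than Python dictionaries.

```python
# Minimal star-schema sketch: numeric "facts" reference descriptive
# "dimensions" by key. All table contents are hypothetical.

dim_vendor = {1: {"name": "Acme"}, 2: {"name": "Bolt Co"}}
dim_date = {20090901: {"quarter": "Q3-2009"}, 20091005: {"quarter": "Q4-2009"}}

fact_purchases = [  # one row per purchase transaction
    {"vendor_id": 1, "date_id": 20090901, "qty": 10, "amount": 125.0},
    {"vendor_id": 2, "date_id": 20090901, "qty": 4, "amount": 50.0},
    {"vendor_id": 1, "date_id": 20091005, "qty": 6, "amount": 75.0},
]

def spend_by_vendor(facts, vendors):
    """Roll the fact table up along the vendor dimension."""
    totals = {}
    for f in facts:
        name = vendors[f["vendor_id"]]["name"]
        totals[name] = totals.get(name, 0.0) + f["amount"]
    return totals
```

Queries like `spend_by_vendor` are why the dimensional approach supports rapid retrieval: a roll-up is a single pass over the fact table with key lookups into the dimensions.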

With the normalized approach, transaction data are stored in the warehouse using database normalization rules. Tables are grouped by subject areas, which make it easy to add information to the warehouse. However, due to the number of tables involved, difficulties can arise in combining data from different sources and accessing information without a clear understanding of the data groupings.
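The normalized approach can be sketched the same way. Here the same purchase information is split into subject-area tables, so answering even a simple question requires joining several of them; the sample data are invented.

```python
# Normalized approach in miniature: data split into subject-area tables,
# joined at query time. Table contents are hypothetical.

customers = {"C1": {"name": "Acme", "region": "East"}}
products = {"P1": {"desc": "Widget", "unit_price": 12.5}}
orders = [{"order_id": 1, "cust": "C1", "prod": "P1", "qty": 8}]

def order_report(orders, customers, products):
    """Join three normalized tables into one denormalized row per order."""
    return [{
        "customer": customers[o["cust"]]["name"],
        "product": products[o["prod"]]["desc"],
        "value": o["qty"] * products[o["prod"]]["unit_price"],
    } for o in orders]
```

Adding a new subject area means adding a table, which is easy; the cost is that every report must reconstruct the joins correctly, which is where the difficulties noted above arise.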

Tools to create dashboards, reports, e-mail alerts and other representations of business intelligence are obviously critical to the successful use of a BI system. The user expects to easily transition from one output representation to another. Many users will also seek to create ad hoc, customized reports that are supported by easy-to-use reporting tools.

Modeling System Schema

Over the years, a wide range of analytics, also known as mathematical models, have been successfully applied to business planning problems to assist managers in making intelligent decisions. These models use concepts and methods from statistics, econometrics, data mining, applied probability theory, queuing theory, decision analysis, mathematical programming, heuristics and other disciplines of applied mathematics. Modeling systems employ model generators based on these methods that are linked to software for accessing data inputs and writing reports. The vast majority of applications have used ad hoc approaches unconnected with BI systems for these data input/output functions.
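To fix ideas, here is the kind of optimization model a model generator might emit, reduced to a toy: choose production quantities to maximize profit under a capacity constraint. It is solved by brute-force enumeration purely for illustration; real modeling systems generate the model algebraically and hand it to a mathematical-programming solver. All coefficients are invented.

```python
# Toy product-mix model: maximize profit subject to a shared capacity
# constraint. Brute-force search stands in for an LP solver here.

profit_per_unit = {"A": 3.0, "B": 5.0}
hours_per_unit = {"A": 1.0, "B": 2.0}
capacity_hours = 10.0

best_plan, best_profit = None, float("-inf")
for qty_a in range(0, 11):
    for qty_b in range(0, 11):
        hours = qty_a * hours_per_unit["A"] + qty_b * hours_per_unit["B"]
        if hours <= capacity_hours:  # feasibility check
            profit = (qty_a * profit_per_unit["A"]
                      + qty_b * profit_per_unit["B"])
            if profit > best_profit:
                best_plan, best_profit = {"A": qty_a, "B": qty_b}, profit
```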

Modeling system creation has typically followed an artistic approach whereby the model builders use their knowledge and intuition to construct and position the models. The term “decision support systems” reinforces this artistic orientation by implying that modeling systems are one-offs designed for each situation requiring decision support. Recently, though, off-the-shelf modeling systems for repeatable applications such as inventory optimization, supply chain network design, advertising response, and sales force territory design and allocation have emerged.

The schema of Figure 2 depicting a typical modeling system provides a foundation for modeling system construction and use. It also facilitates our discussion of the integration of BI systems and modeling systems presented below. The sequence we follow in discussing modeling system construction is merely one approach, and we do not suggest that it is necessary or even desirable to slavishly follow it for all applications. It highlights the complementary roles of descriptive and optimization models in extracting intelligence from data.

Figure 2: Modeling system schema.

The construction of a modeling system begins with a focused business planning problem requiring analysis. The manager responsible for the planning problem interacts with knowledge professionals in developing an optimization model that can provide intelligent solutions to the problem. Alternatively, the knowledge workers may suggest an off-the-shelf modeling system capable of optimizing the major decisions to be made.

Whether the manager and the knowledge professionals decide to construct a customized modeling system or acquire an off-the-shelf one, the next step is to construct and validate the decision database providing inputs to the optimization model (see [8] for a discussion of the decision database for supply chain planning). The structure and content of this database are determined by the optimization model. It will also store optimal solutions found by the model. As shown in Figure 2, the knowledge professionals develop ETL routines for processing transactional data fed directly to the decision database and for transactional data fed to descriptive models that in turn produce forecasts and projections stored in the decision database.

Unlike the data warehouse of a BI system, data in the decision database may reflect aggregations of products, manufacturing and distribution processes, markets, and suppliers. Such aggregations are necessary and desirable for optimization models addressing strategic and tactical decision-making. Data aggregation is also essential in demand forecasting, activity-based costing methods and other descriptive modeling applications. Flexible ETL routines in the modeling system are needed to allow aggregations to be modified as the situations analyzed by the system evolve.
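The aggregation step can be sketched as follows: SKU-level transactions are rolled up to the product-family and monthly granularity a tactical optimization model expects. The family assignments and figures are illustrative only.

```python
# Aggregating transactional data to the (product family, month) cells
# of a decision database. Mappings and quantities are hypothetical.

sku_to_family = {"A-100": "widgets", "A-200": "widgets", "B-300": "gadgets"}

transactions = [
    {"sku": "A-100", "month": "2009-06", "units": 40},
    {"sku": "A-200", "month": "2009-06", "units": 10},
    {"sku": "B-300", "month": "2009-06", "units": 7},
]

def aggregate(transactions, mapping):
    """Sum unit demand into (family, month) cells."""
    cells = {}
    for t in transactions:
        key = (mapping[t["sku"]], t["month"])
        cells[key] = cells.get(key, 0) + t["units"]
    return cells
```

Keeping `sku_to_family` as explicit, editable data is one way to make the ETL routine flexible: when the business regroups its products, only the mapping changes, not the aggregation logic.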

The initial decision database for a specific application may be validated using a data set of a recent historical period from which an optimization model is constructed and solved.

Although the past cannot be optimized, the validation results are useful in convincing managers of the accuracy and benefits of data-driven models. The manager and the knowledge professionals working with him/her will then create multiple scenarios of the future, each of which is optimized, where the number and range of scenarios will depend on the planning horizon of the application — strategic, tactical or operational.
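The scenario loop can be sketched in a few lines: each future scenario is optimized separately and the results stored for comparison. The single-product "model" here (produce up to capacity, sell up to demand) is deliberately trivial, and all figures are invented.

```python
# Optimizing each demand scenario separately and recording the results
# for side-by-side comparison. Scenario values are hypothetical.

scenarios = {"low": 80, "base": 120, "high": 160}  # forecast demand per scenario
capacity, margin = 100, 4.0

results = {}
for name, demand in scenarios.items():
    qty = min(demand, capacity)  # optimal output for this toy model
    results[name] = {"produce": qty, "profit": qty * margin}
```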

Typical outputs from a modeling system include maps, spreadsheets, tables and graphs summarizing the results of a single optimization run or comparing results across multiple scenario runs. The solution may also be subject to post-optimality analysis to compute, for example, disaggregated production details from an aggregate plan, or supply chain transfer costs derived from an optimal network design. For operational decision problems, reports based on the solution selected by the manager and/or the knowledge professionals will often be electronically forwarded to other managers who have the responsibility of implementing it.

The modeling system schema explicitly differentiates optimization models from descriptive models, but the two are mutually dependent. Results from an optimization model for airline crew scheduling depend on the accuracy of the model’s inputs regarding costs and crew availability. Similarly, the design and implementation by an industrial products company of an accurate demand forecasting model does not in and of itself indicate where, how and when the company’s products should be manufactured and distributed.
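The mutual dependence noted above can be made concrete: a descriptive model (here, a simple moving-average forecast) supplies an input that an optimization step then acts on. The demand history and the order-up-to policy are invented for illustration.

```python
# A descriptive model feeding an optimization step: the forecast alone
# decides nothing; the ordering policy alone has nothing to act on.

history = [90, 110, 100]  # recent monthly demand (hypothetical)

def moving_average_forecast(history, window=3):
    """Descriptive model: average of the most recent observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def order_quantity(forecast, on_hand, safety_stock=10):
    """Toy decision rule: order up to forecast demand plus safety stock."""
    return max(0, round(forecast) + safety_stock - on_hand)

forecast = moving_average_forecast(history)   # descriptive model output
order = order_quantity(forecast, on_hand=30)  # decision driven by it
```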

Given the enormous data sets created by Web users, as well as those generated by ERP and other transactional systems, the current tendency to focus on statistical and data mining analytics is understandable.

Still, knowledge professionals should also promote optimization models as important and necessary tools for improving business decision-making. This message is implicit in IBM’s naming of its new entity created in April as the Business Analytics and Optimization Services group.

Integration of BI and Modeling Systems

Some incorporation of analytics, especially descriptive models, into BI systems has already been achieved. As reviewed above, improved business decision-making entails creating and applying both descriptive and optimization models. We discuss here a method for integrating the BI system functionality depicted in Figure 1 with the modeling system functionality depicted in Figure 2. The integration is motivated by the managerial need to improve KPIs produced by the BI system in a focused area of decision-making.

Step 1: Conceptualize a model to optimize decisions underlying inferior KPIs. The managers and the knowledge professionals responsible for improving an under-performing area of the company scope out decisions affecting the area along with relevant constraints and costs. The knowledge professionals draw up a model for optimizing these decisions.

Step 2: Determine whether to build a customized modeling system or acquire an off-the-shelf modeling system. The knowledge professionals do a search of available off-the-shelf modeling systems to learn if one or more are appropriate to the decision problems scoped out in Step 1. The search may include an RFI sent to relevant software firms. The managers and the knowledge professionals then make a collective decision about the software option to pursue.

Step 3: Define the decision database needed to generate the optimization model. Depending on the selection made in Step 2, the decision database is defined either by the off-the-shelf modeling system or by the customized modeling system being implemented by the knowledge professionals.

Step 4: Create the decision database by adapting the BI system’s ETL routines and designing and implementing necessary descriptive models. This will be the most time-consuming task of the system integration. However, creation of an accurate and flexible decision database should be greatly facilitated by the application and possible adaptation of the BI system’s ETL routines.

Step 5: Exercise the modeling system and link the decision database, including outputs from the optimization model, to the BI system’s reporting tools. The BI system’s reporting tools should provide managers with a wider and more insightful range of reports than those available from the typical modeling system.

Step 6a: For strategic and/or tactical decision-making, program the BI system to provide alerts for re-use of the embedded modeling system; alternatively, the integrated systems may be exercised over a regular planning cycle to support integrated decision-making. If the inferior KPIs that instigated the modeling system integration into the BI system involve strategic and/or tactical decision-making, the BI system can be used to monitor the improved plans identified by the modeling system and to notify the managers when modeling analysis is needed to identify modified plans. Alternatively, a regular cycle such as one month for tactical planning or one year for strategic planning may be established for applying the modeling system.

Step 6b: For operational decision-making, the integrated systems become a tool used daily or on a real-time basis by dispatchers, schedulers and other operational personnel. The BI system’s functionality serves to enhance the data acquisition and reporting capabilities of the operational modeling system while the modeling system provides in-depth analysis of the operational planning problem.
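The Step 6a alerting idea can be sketched simply: the BI layer monitors a KPI produced by the implemented plan and flags the periods in which the embedded modeling system should be re-run. The KPI, threshold and monthly values below are hypothetical.

```python
# BI-driven re-optimization alert: flag months where an observed KPI
# (here, a fill rate) falls below its target. Values are hypothetical.

kpi_threshold = 0.95  # e.g., required order fill rate
observed_fill_rates = {"2009-07": 0.97, "2009-08": 0.96, "2009-09": 0.91}

alerts = [month for month, rate in observed_fill_rates.items()
          if rate < kpi_threshold]  # months needing modeling analysis
```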

Jeremy F. Shapiro is a professor emeritus in the Sloan School of Management at MIT, a former co-director of MIT’s Operations Research Center and author of the textbook “Modeling the Supply Chain.”


1. Lohr, S., 1999, “Economy Transformed, Bit by Bit,” New York Times, Dec. 20.

2. Drucker, P., 1999, “Beyond the Information Revolution,” The Atlantic, October, pp. 47-57.

3. Erickson, W. W., 2007, “Predictive Analytics: Extending the Value of Your Data Warehousing Investment,” The Data Warehouse Institute, First Quarter, p. 5.

4. Davenport, T. H. and J. G. Harris, 2007, “Competing on Analytics: The New Science of Winning,” Harvard Business School Press. p. 7.

5. Cognos, 2008, “BI for Business Users,” January (white paper).

6. Oco, 2008, “Comprehensive On-demand Business Intelligence Solution” (company document).

7. Wikipedia, 2009, “Data Warehouse.”

8. Shapiro, J. F., 2007, “Modeling the Supply Chain,” 2nd edition, Duxbury Press.

