
Analytics Magazine

Successfully operationalizing analytics

November/December 2012

A repeatable, efficient process for creating and effectively deploying predictive analytic models into production.

By James Taylor

Organizations are increasingly adopting predictive analytics, and they are adopting it more broadly. Many now use dozens or even hundreds of predictive analytic models, and these models are used in real-time decision-making and in operational, production systems. However, many analytic teams rely on tools and approaches that will not scale to this level of adoption. These teams lack a repeatable, efficient process for creating and effectively deploying predictive analytic models into production. To succeed, they need to operationalize analytics.

Operationalizing analytics requires both a repeatable, industrial-scale process for developing the dozens or hundreds of predictive analytic models that will be needed and a reliable architecture for deploying predictive analytic models into production systems.

The first step in operationalizing analytics is moving from a cottage industry to an industrial process for building analytic models. This means moving away from individual scripting environments in which every task is performed by hand, reuse is limited, only a small number of expert analytic practitioners are involved, and those practitioners do not follow a standard process. Such an approach can and does produce high-quality models, but it cannot scale to allow an organization to become broadly analytic in its decision-making. A more industrialized process has a number of characteristics.

Access to data is standardized based on well-defined metadata and standard sets of data about customers, products, etc. Definitions of this data are shared and analytical datasets are generated in a repeatable and increasingly automated way. This more systematic approach to data management feeds into a workbench environment for defining the modeling workflow.
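To make the idea concrete, the shared definitions can themselves be captured in code so the analytical dataset is regenerated the same way every time. The sketch below is purely illustrative (Python with pandas against a SQL store; the table and column names are invented), not a description of any particular product:

    import pandas as pd

    # Shared, agreed definitions of the customer attributes used for modeling.
    # In practice these would live in a metadata repository, not in a script.
    CUSTOMER_COLUMNS = ["customer_id", "tenure_months", "orders_last_90d", "avg_order_value"]

    def build_customer_dataset(conn):
        """Regenerate the analytical dataset from the shared definitions."""
        cols = ", ".join(CUSTOMER_COLUMNS)
        return pd.read_sql(f"SELECT {cols} FROM customer_features", conn)

    # Usage: every modeler gets the same dataset by calling the same function.
    # import sqlite3
    # conn = sqlite3.connect("warehouse.db")
    # df = build_customer_dataset(conn)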

Products such as SAS Enterprise Miner allow standard workflows to be developed and shared through a repository, streamlining and standardizing how modeling is performed. These workflows can use in-database mining capabilities to push data preparation, transformation and even algorithms into the data infrastructure, improving throughput by reducing data movement. In-memory and other high performance analytic capabilities, as well as intelligent automation of modeling activities, can be applied as appropriate in the steps defined in the workflows.
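The data-movement point can be illustrated with a minimal sketch, again with invented table and column names: the aggregation is expressed in SQL and executed inside the database engine, so only one summarized row per customer is transferred rather than every raw transaction.

    import pandas as pd

    def load_features(conn):
        """Build modeling features inside the database engine."""
        # The GROUP BY runs where the data lives; only one summarized row
        # per customer crosses the network, not every raw transaction.
        query = """
            SELECT customer_id,
                   COUNT(*)    AS txn_count,
                   AVG(amount) AS avg_amount,
                   MAX(amount) AS max_amount
            FROM transactions
            GROUP BY customer_id
        """
        return pd.read_sql(query, conn)

Vendor in-database mining goes further, pushing transformations and even model fitting into the engine, but the principle is the same: compute where the data lives.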

Predictive analytic workbenches also support the ongoing management and monitoring of models once they are in use. Workbenches allow an analytic team to set up automated monitoring of models to see when they need to be re-tuned or even completely rebuilt. These capabilities also help track the performance of models to confirm their predictive power and behavior.
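Commercial workbenches automate this monitoring, but the underlying idea is simple. The sketch below is a hypothetical example using the widely used population stability index: it compares the score distribution the model produced at build time with the distribution it is producing now.

    import numpy as np

    def population_stability_index(expected, actual, bins=10):
        """Compare the score distribution seen at build time ('expected')
        with the scores the model is producing now ('actual')."""
        # Bin edges come from the development sample; current scores are
        # clipped into that range so every value falls in some bin.
        edges = np.quantile(expected, np.linspace(0.0, 1.0, bins + 1))
        edges = np.unique(edges)                      # guard against ties
        expected = np.clip(expected, edges[0], edges[-1])
        actual = np.clip(actual, edges[0], edges[-1])
        e_frac = np.histogram(expected, edges)[0] / len(expected)
        a_frac = np.histogram(actual, edges)[0] / len(actual)
        e_frac = np.clip(e_frac, 1e-6, None)          # avoid log(0)
        a_frac = np.clip(a_frac, 1e-6, None)
        return float(np.sum((a_frac - e_frac) * np.log(a_frac / e_frac)))

    # A common rule of thumb: PSI above roughly 0.25 signals enough drift
    # that the model should be reviewed for re-tuning or rebuilding.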

Finally, these workbenches can often be wrapped with an interface suitable for less technical users. Such interfaces allow those users to build and execute workflows that take advantage of the automation capabilities to produce large numbers of good-enough models quickly. Working with an analytic team, these users can produce first-cut models and participate more fully in reviews of models, while allowing the analytic team to focus on high-value, high-complexity problems.

These kinds of capabilities allow organizations to develop an industrial-scale process for developing predictive analytics. Operationalizing analytics requires this, but it also requires a more systematic focus on the use of analytics in operational systems. As organizations expand their use of predictive analytics, it becomes increasingly clear that the long-term value of predictive analytics comes in improving the quality and effectiveness of operational decisions.

Operational decisions are highly repeatable decisions about a single customer or transaction. Because they are made over and over again, these decisions create the data needed for effective predictive analytics. Because they must be made by front-line staff or by completely automated systems, they do not lend themselves to analytic approaches that rely on the end user, such as visualization and query tools. To influence these decisions using predictive analytics, organizations must embed executable analytic models deeply into their operational systems, driving better decisions in real time in high-volume, transactional environments.
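A minimal sketch of what "embedded" means in practice: the model is reduced to an executable scoring function that sits directly in the transaction-handling path, with no analyst or reporting tool in the loop. The coefficients, feature names and thresholds below are invented for illustration only.

    import math

    # Illustrative only: the coefficients, feature names and thresholds are
    # made up for this sketch, not taken from any real model.
    WEIGHTS = {"amount": 0.004, "orders_last_90d": -0.3, "tenure_months": -0.02}
    INTERCEPT = -1.0
    REFER_THRESHOLD = 0.6
    DECLINE_THRESHOLD = 0.9

    def risk_score(txn):
        """Executable model: a logistic score computed inline, in real time."""
        z = INTERCEPT + sum(WEIGHTS[k] * txn[k] for k in WEIGHTS)
        return 1.0 / (1.0 + math.exp(-z))

    def handle_transaction(txn):
        """The operational decision itself: approve, refer or decline."""
        score = risk_score(txn)
        if score > DECLINE_THRESHOLD:
            return "decline"
        return "refer" if score > REFER_THRESHOLD else "approve"

    # handle_transaction({"amount": 120.0, "orders_last_90d": 4, "tenure_months": 30})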

Many analytic teams consider their work done once a model is built and validated. A team focused on operationalizing analytics will instead focus on the point at which operational decisions are being made more effectively thanks to analytics embedded in the systems used to make those decisions. This requires a change in focus and the use of some more modern technologies.

One way to make predictive analytic models available in operations is to use in-database scoring infrastructure, which takes models and pushes them directly into the core of an organization’s operational data stores. Once deployed, a model is available as a function and can be included in views or stored procedures. This gives operational systems direct access to the model’s result while ensuring that the score is calculated live, when requested, and not taken from a potentially out-of-date batch run.
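Production in-database scoring relies on vendor infrastructure, but the deployment pattern can be sketched with SQLite’s user-defined functions: the model becomes a SQL function, and a view exposes a live score to any operational query. The model here is a hand-written stand-in for illustration, not a real fitted model.

    import sqlite3

    def churn_score(tenure_months, orders_last_90d):
        # Stand-in for a real model: a made-up formula used only to show
        # the deployment pattern.
        return max(0.0, min(1.0, 0.8 - 0.01 * tenure_months - 0.05 * orders_last_90d))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, tenure_months INTEGER, orders_last_90d INTEGER)")
    conn.execute("INSERT INTO customers VALUES (1, 24, 2), (2, 3, 0)")

    # Register the model as a SQL function, then expose it through a view so
    # operational queries get a live score, not a stale batch result.
    conn.create_function("churn_score", 2, churn_score)
    conn.execute("""CREATE VIEW customer_scores AS
                    SELECT id, churn_score(tenure_months, orders_last_90d) AS churn_risk
                    FROM customers""")

    for row in conn.execute("SELECT * FROM customer_scores"):
        print(row)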

Predictive analytic models can also be deployed directly into operational environments. The increasing adoption of the Predictive Model Markup Language (PMML), an XML standard for defining predictive analytic models, means that a model can be generated from an analytic workbench and then deployed into a variety of environments. Many business rules and business process management systems, for instance, support PMML. This allows them to load the definition of a model directly and then execute it when the rules or process need to know the prediction.
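As a hedged illustration of the PMML round trip, the sketch below builds a model with scikit-learn, exports it with the open-source sklearn2pmml package and scores it with pypmml (both assume a Java runtime is available); a rules or process engine that supports PMML performs the equivalent of the final step whenever a decision needs the prediction. The data and field names are invented.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn2pmml import sklearn2pmml
    from sklearn2pmml.pipeline import PMMLPipeline
    from pypmml import Model

    # 1. In the analytic environment: fit a simple model on a tiny made-up
    #    sample and export it as PMML.
    X = pd.DataFrame({"recency_days": [5, 40, 12, 60], "orders": [8, 1, 5, 0]})
    y = pd.Series([1, 0, 1, 0], name="repurchase")
    pipeline = PMMLPipeline([("classifier", LogisticRegression())])
    pipeline.fit(X, y)
    sklearn2pmml(pipeline, "repurchase_model.pmml")

    # 2. In the operational environment: load the PMML and score a single case.
    model = Model.fromFile("repurchase_model.pmml")
    print(model.predict({"recency_days": 9, "orders": 3}))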

In addition, many applications for fraud detection, credit risk management or customer intelligence have been designed to allow new models to be rapidly integrated into operational systems. As more packaged applications tackle decision-making, this capability will only become more widespread, giving analytic teams more deployment options and further reducing the barriers to using predictive analytics in operations.

To adopt predictive analytics broadly and effectively, organizations must operationalize analytics. Operationalizing analytics requires an industrialized process for building predictive analytic models, using technology that emphasizes automation, collaboration and reuse, combined with a focus on rapid deployment of predictive analytic models into operational systems.

Note: For a complimentary copy of James Taylor’s white paper on “Operationalizing Analytics,” click here.

James Taylor is CEO of Decision Management Solutions (www.decisionmanagementsolutions.com). A consultant and leading expert in decision management, Taylor is also a writer, speaker and a faculty member of the International Institute for Analytics.
