
Analytics Magazine

Successfully operationalizing analytics

November/December 2012

A repeatable, efficient process for creating and effectively deploying predictive analytic models into production.

By James Taylor

Organizations are increasingly adopting predictive analytics, and adopting it more broadly: many now use dozens or even hundreds of predictive analytic models, and these models increasingly drive real-time decision-making in operational, production systems. However, many analytic teams rely on tools and approaches that will not scale to this level of adoption. These teams lack a repeatable, efficient process for creating and effectively deploying predictive analytic models into production. To succeed, they need to operationalize analytics.

Operationalizing analytics requires both a repeatable, industrial-scale process for developing the dozens or hundreds of predictive analytic models that will be needed and a reliable architecture for deploying predictive analytic models into production systems.

The first step in operationalizing analytics is moving from a cottage industry to an industrial process for building analytic models. This means moving away from individual scripting environments in which every task is performed by hand, reuse is limited, and only a small number of expert analytic practitioners are involved, each working without a standard process. Such an approach can and does produce high-quality models, but it cannot scale to allow an organization to become broadly analytic in its decision-making. A more industrialized process has a number of characteristics.

Access to data is standardized based on well-defined metadata and standard sets of data about customers, products, etc. Definitions of this data are shared and analytical datasets are generated in a repeatable and increasingly automated way. This more systematic approach to data management feeds into a workbench environment for defining the modeling workflow.
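To make this concrete, here is a minimal Python sketch of repeatable dataset generation: feature definitions live in one shared, versioned dictionary, and every modeling project derives its analytical dataset from them. The table, columns and features (customers, start_date, churned and so on) are illustrative assumptions, not details from the article.

```python
import sqlite3
import pandas as pd

# Shared, versioned feature definitions: name -> SQL expression.
# Every project builds its dataset from these, so definitions stay consistent.
FEATURE_DEFINITIONS = {
    "tenure_months": "CAST((julianday('now') - julianday(start_date)) / 30 AS INT)",
    "avg_monthly_spend": "total_spend / NULLIF(active_months, 0)",
    "support_calls_90d": "support_calls_90d",
}

def build_analytical_dataset(conn: sqlite3.Connection,
                             target: str = "churned") -> pd.DataFrame:
    """Generate the standard customer dataset from the shared definitions."""
    cols = ", ".join(f"{expr} AS {name}"
                     for name, expr in FEATURE_DEFINITIONS.items())
    query = f"SELECT customer_id, {cols}, {target} FROM customers"
    return pd.read_sql_query(query, conn)
```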

Products such as SAS Enterprise Miner allow standard workflows to be developed and shared through a repository, streamlining and standardizing how modeling is performed. These workflows can use in-database mining capabilities to push data preparation, transformation and even algorithms into the data infrastructure, improving throughput by reducing data movement. In-memory and other high performance analytic capabilities, as well as intelligent automation of modeling activities, can be applied as appropriate in the steps defined in the workflows.
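The data-movement point can be illustrated with a small, hedged Python sketch: the same preparation is done once by pulling raw rows to the client and once by pushing the aggregation into the database, so that only a small per-customer summary crosses the wire. The transactions table is an assumed example; this shows the general pattern, not SAS Enterprise Miner's specific mechanism.

```python
import pandas as pd

def naive_prep(conn):
    # Anti-pattern: move every transaction row to the client, then aggregate.
    txns = pd.read_sql_query("SELECT customer_id, amount FROM transactions", conn)
    return txns.groupby("customer_id")["amount"].agg(["count", "sum", "mean"])

def pushed_down_prep(conn):
    # In-database pattern: the database aggregates; only the summary moves.
    return pd.read_sql_query(
        """SELECT customer_id,
                  COUNT(*)    AS txn_count,
                  SUM(amount) AS txn_total,
                  AVG(amount) AS txn_avg
           FROM transactions
           GROUP BY customer_id""",
        conn)
```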

Predictive analytic workbenches also support the ongoing management and monitoring of models once they are in use. They allow an analytic team to set up automated monitoring of models to see when they need to be re-tuned or even completely rebuilt. These capabilities also help track the performance of models to confirm their predictive power and behavior.
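Such monitoring is often implemented with a drift statistic. The sketch below uses the population stability index (PSI) to compare the score distribution at model-build time with the current production distribution; the 0.10 and 0.25 thresholds are a common rule of thumb, offered here as an assumption rather than a standard from the article.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between build-time scores (expected) and production scores (actual)."""
    cuts = np.quantile(expected, np.linspace(0, 1, bins + 1))
    # Bucket both samples on the build-time quantiles; interior edges only,
    # so every value falls into one of `bins` buckets.
    e = np.bincount(np.digitize(expected, cuts[1:-1]), minlength=bins) / len(expected)
    a = np.bincount(np.digitize(actual, cuts[1:-1]), minlength=bins) / len(actual)
    e, a = np.clip(e, 1e-6, None), np.clip(a, 1e-6, None)  # avoid log(0)
    return float(np.sum((a - e) * np.log(a / e)))

def monitoring_action(build_scores, live_scores):
    psi = population_stability_index(build_scores, live_scores)
    if psi > 0.25:
        return "rebuild"   # distribution has shifted badly
    if psi > 0.10:
        return "re-tune"   # drifting; refresh the model
    return "ok"
```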

Finally, these workbenches can often be wrapped with an interface suitable for less technical users. Such features allow these users to build and execute workflows that take advantage of the automation capabilities to produce large numbers of good-enough models quickly. Working with an analytic team, they can produce first-cut models and participate more fully in model reviews, while freeing the analytic team to focus on high-value, high-complexity problems.
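The automation behind such "large numbers of good-enough models" can be as simple as fitting a fixed set of candidate algorithms with sensible defaults and ranking them by cross-validated performance. A minimal scikit-learn sketch, with an illustrative candidate set:

```python
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Candidate models with near-default settings; the set itself is an assumption.
CANDIDATES = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gbm": GradientBoostingClassifier(),
}

def first_cut_models(X, y):
    """Rank candidates by cross-validated AUC, best first."""
    scores = {name: cross_val_score(est, X, y, cv=5, scoring="roc_auc").mean()
              for name, est in CANDIDATES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

A front end only needs to expose a dataset picker and a "build" button over a function like this; the analytic team then reviews the ranked first-cut models.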

These kinds of capabilities allow organizations to develop an industrial-scale process for developing predictive analytics. Operationalizing analytics requires this, but it also requires a more systematic focus on the use of analytics in operational systems. As organizations expand their use of predictive analytics, it becomes increasingly clear that the long-term value of predictive analytics comes in improving the quality and effectiveness of operational decisions.

Operational decisions are highly repeatable decisions about a single customer or transaction. Because they are made over and over again, these decisions create the data needed for effective predictive analytics. Because they must be made by front-line staff or completely automated systems, they don't lend themselves to analytic approaches that rely on an end user, such as visualization and query tools. To influence these decisions with predictive analytics, organizations must embed executable analytic models deeply into their operational systems, driving better decisions in real time in high-volume, transactional environments.

Many analytic teams consider their work done when the model is built and validated. A team focused on operationalizing analytics will instead focus on the point at which operational decisions are actually being made more effectively, thanks to analytics embedded in the systems used to make those decisions. This requires a change in focus and the use of some more modern technologies.

One way to make predictive analytic models available in operations is to use in-database scoring infrastructure, which takes models and pushes them directly into the core of an organization's operational data stores. Once deployed, a model is available as a function and can be included in views or stored procedures. This gives operational systems direct access to the model's result while ensuring that it is calculated live, when requested, rather than read from a potentially out-of-date batch run.
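The contract can be shown with a runnable toy using SQLite's user-defined functions: a hypothetical logistic-regression churn model is registered as a SQL function and exposed through a view, so every query computes the score live at request time. Production in-database scoring engines differ in mechanics, and the coefficients here are invented for illustration.

```python
import math
import sqlite3

# Hypothetical model coefficients; a real deployment would take these
# from the trained model.
COEF = {"intercept": -2.0, "tenure_months": -0.03, "support_calls_90d": 0.40}

def churn_score(tenure_months, support_calls_90d):
    z = (COEF["intercept"]
         + COEF["tenure_months"] * tenure_months
         + COEF["support_calls_90d"] * support_calls_90d)
    return 1.0 / (1.0 + math.exp(-z))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INT, tenure_months INT, support_calls_90d INT)")
conn.execute("INSERT INTO customers VALUES (1, 24, 5), (2, 60, 0)")
conn.create_function("churn_score", 2, churn_score)  # the model as a SQL function
conn.execute("""CREATE VIEW customer_risk AS
                SELECT id, churn_score(tenure_months, support_calls_90d) AS score
                FROM customers""")
print(conn.execute("SELECT * FROM customer_risk").fetchall())
```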

Predictive analytic models can also be deployed directly into operational environments. The increasing adoption of the Predictive Model Markup Language (PMML), an XML standard for defining predictive analytic models, means that a model can be generated from an analytic workbench and then deployed into a variety of environments. Many business rules and business process management systems, for instance, support PMML. This allows them to load the definition of a model directly and then execute it when the rules or process need to know the prediction.
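For example, assuming a model exported from the workbench as churn.pmml and the third-party pypmml package (both assumptions for illustration), an operational system can load and execute the model in a few lines:

```python
from pypmml import Model  # pip install pypmml (assumed dependency)

model = Model.load("churn.pmml")  # the workbench-exported model definition
prediction = model.predict({
    "tenure_months": 24,
    "support_calls_90d": 5,
})
print(prediction)  # typically the predicted class and/or probability
```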

In addition, many applications for fraud detection, credit risk management or customer intelligence have been designed to allow new models to be rapidly integrated into operational systems. As more packaged applications tackle decision-making, this capability will only become more widespread, giving analytic teams more deployment options and further reducing the barriers to using predictive analytics in operations.

To adopt predictive analytics broadly and effectively, organizations must operationalize analytics. Operationalizing analytics requires an industrialized process for building predictive analytic models, using technology that emphasizes automation, collaboration and reuse, combined with a focus on rapid deployment of predictive analytic models into operational systems.

Note: For a complimentary copy of James Taylor’s white paper on “Operationalizing Analytics,” click here.

James Taylor is CEO of Decision Management Solutions (www.decisionmanagementsolutions.com). A consultant and leading expert in decision management, Taylor is also a writer, speaker and a faculty member of the International Institute for Analytics.
