

Analytics Magazine

Evolving regulations put new demands on model management

March/April 2012

Best practices for adapting to the new banking environment.

By Andrew Jennings

The benefits of analytical models and sound model management practices are well-documented: improved decision-making ability and speed, more efficient business operations, better top- and bottom-line revenues and greater control over risk. Making smarter decisions and driving more efficiency and profitability has never been of more value to business, and the economic recession and the constant push from C-suites everywhere to “do more with less” have made getting the maximum results from models even more critical.

In the banking industry, regulators have responded to the economic crisis by strengthening the requirements for model management. This makes sense given the ubiquity of models and their central role in credit policy and decisions. These regulations already have had a significant impact on how banks develop and deploy models, but the often-distributed teams and business units within a bank make it hard to respond in a coherent fashion, which in turn makes it more difficult to simultaneously comply and deliver business results.

At the same time, best practices for managing business models in this difficult economic and regulatory climate have begun to emerge. Often the guidelines and policies for model management are standard fare, and already well understood – what banks often lack is the ability to carry out these policies effectively as model volumes reach the hundreds or thousands. By adapting to regulatory changes and finding the right mix of model centralization and automation, organizations can achieve cheaper and easier compliance, make smarter decisions and improve their bottom line.


New, Stricter Regulations Take Effect

Two regulatory bodies have issued mandates that have significantly impacted banks and their business models. First, in the United States, the Board of Governors of the Federal Reserve System (the Fed) and the Office of the Comptroller of the Currency (OCC) issued expectations in 2011 for banks that use quantitative models in any aspect of their business, acknowledging that in recent years banks have come to rely heavily on these models as part of their credit policies and decisions and applied them to more complex products and with more ambitious scope. The agencies insisted that banks do a better job of managing and minimizing their “model risk” – the potential for damage when models play a material role in bank decision-making. This damage can occur when the models are not performing at their peak, or when they are not well used within credit policies. The regulations further stipulate a number of practices for model management, including model validation, strong governance, internal audit coverage and clear internal policies and documentation.

Second, the global Basel Committee on Banking Supervision, through its most recent Basel III and other mandates, has sharpened its focus on model validation. Basel now requires financial institutions that calculate capital requirements to fully understand how the models they use are developed, conduct regular and ongoing validation of those models, and prove they are responding to validation findings and adjusting their models as needed.

One common point of emphasis throughout the new banking regulations is finding and correcting current weaknesses in bank models, so that they deliver the accuracy and precision that stakeholders demand. Models have been increasingly used in banking to measure capital reserve requirements and manage complex decisions across the credit account lifecycle. By fixing weaknesses in existing models, regulators hope to ensure individual banks are healthy and resilient and that consumer investments and trust in those banks are as secure as possible.

Impact of Regulations on Businesses

These new regulatory guidelines are often just that – guidelines. The onus, then, falls on financial institutions to develop the rigorous model and decision management frameworks needed to satisfy the requirements.

The more models that banks use, and the greater their complexity, the harder it is to ensure they all comply with the latest regulations. In addition, while core Basel principles are global and apply to all financial institutions in the same manner, there are nuances in how individual countries adopt the rules to fit their needs – and furthermore, in how the rules are interpreted and applied by individual regulators. As such, another layer of difficulty falls on internationally distributed financial institutions, as they must be able to demonstrate adherence to the many potential variations of regulations by geography and by auditor as part of their compliance strategy.

On the whole, banks have greatly struggled to adopt and fully comply with new modeling regulations. The compliance burden typically falls to IT departments, who in these lean economic times are under more stress as banks have consolidated and cut back on their staff and other resources. This often leaves IT under-resourced and overworked, even before compliance activities come their way.

Organizations also face several other challenges in managing their models. First, maintaining transparency is difficult as more models are introduced, grow in complexity and are applied to decisions. Yet models need to be easy to understand, defend and explain, as this proves beneficial when having internal discussions, facing regulatory reviews or explaining to a customer why, for example, a loan was denied. Second, many institutions lack model inventories or reporting schedules; they should have such mechanisms in place because they are useful for monitoring high-impact models and identifying the ones most in need of review and redevelopment. Third, banks often aren’t able to devote the proper time and effort to the documentation and administration that their models require. The more banks can standardize and automate workflows, reports and the like, the better and faster they can demonstrate compliance and respond to management about matters critical to the business.

Best Practices for Adapting to New Modeling Regulations

A best practice for adapting to new model regulations is to build an organizational policy for good, comprehensive model and credit policy management. The framework should have eight key components, because these are all on the watch lists of regulators as they do their audit checks. Many of these are tried and trusted as sound aspects of credit policy and related decision-making.

First, banks should make sure they have a clearly stated credit policy and review that policy regularly. These policies are critical to understanding the effectiveness of models and decisions, and they have a direct impact on the bottom line. In the United States, the Fed and OCC require a review of policies at least annually; reviewing them every six months is therefore a sound business practice that comfortably exceeds the requirement. The most thorough reviews ask the following questions, and document the answers and reasons behind them:

  • Does each policy serve a purpose? Policies typically become part of corporate cultures, so reviews are the proper time to ensure they are useful in the current business environment and up to date with the latest regulatory requirements.
  • Are policies defensible? Are they truly indicative of risk?
  • Are policies consistent throughout an account lifecycle and across channels?
  • Are policies redundant, or do gaps exist? The goal, of course, is to eliminate redundancy, reduce contradictions and identify gaps or overlaps that can be filled so that risk can be properly mitigated.

Second, organizations should be sure to prepare a suitable data sample. Regulators require that model validation sampling techniques are complete, responsible and relevant, because incorrect or inaccurate sampling can impact model performance. As most modelers are aware, initial validations should use samples that are independent of the development sample, so that more realistic benchmarks for in-production performance can be realized. Then, ongoing model validations should use samples that are of adequate size and representative of all subpopulations of interest – banks complying with Basel II will already be able to demonstrate this.
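The sampling discipline described above – an independent holdout for initial validation, with every subpopulation of interest represented – can be sketched in a few lines. This is a minimal illustration, not a production sampler; the `segment` field and the account records are hypothetical:

```python
import random
from collections import defaultdict

def stratified_holdout(records, key, holdout_frac=0.3, seed=42):
    """Split records into a development sample and an independent
    validation sample, preserving the mix of each subpopulation."""
    rng = random.Random(seed)
    by_segment = defaultdict(list)
    for rec in records:
        by_segment[rec[key]].append(rec)
    dev, val = [], []
    for recs in by_segment.values():
        rng.shuffle(recs)                    # randomize within each segment
        cut = int(len(recs) * holdout_frac)  # carve off the holdout share
        val.extend(recs[:cut])
        dev.extend(recs[cut:])
    return dev, val

# Hypothetical accounts tagged by product segment
accounts = [{"id": i, "segment": "card" if i % 2 else "auto"} for i in range(100)]
dev, val = stratified_holdout(accounts, key="segment")
```

Because the split is stratified, each segment contributes proportionally to the validation sample, which is what makes performance benchmarks on the holdout representative.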

Third, organizations need to ensure model segmentation transparency. The key to successful segmentation is identifying the right variables to split a population into actionable segments that make statistical and business sense. Automated tools and techniques now can make this process much faster and easier, and the best solutions for this deliver optimized segmentation schemes that improve model precision while maintaining transparency. In general, organizations need to clearly document how they segmented the subpopulations within their portfolios and how their segmentation supports business objectives.

The fourth step for organizations to ensure comprehensive, compliant model management is to choose the right model type. Certain decisions will require models with greater transparency for regulators – for example, credit risk models generally require a greater level of transparency than fraud models, which is why many fraud models are built using neural networks, a technology that makes reporting about specific characteristics more difficult. Choose a model based on data type and how effective it will be for a particular decision. The best risk models make sense to regulators and customers, and can be easily re-engineered or fine-tuned if needed to ensure they map directly to business goals.


Fifth, organizations need to diligently track performance in terms of both statistical and business effectiveness. Regulators expect organizations to monitor their models continually so they can be recalibrated or rebuilt as necessary to ensure proper performance. The following standard types of reports are generally useful:

  • Population stability reports, which can indicate shifts in score distributions that could be indicative of changing model effectiveness on that population;
  • Characteristic analysis reports, which help explain why shifts in score distribution may be taking place, tracing back to the predictive inputs of the model;
  • Delinquency distribution reports, which illustrate a scorecard’s effectiveness at rank ordering accounts by risk;
  • Vintage analysis reports, which compare delinquency distribution reports in an effort to spot trends and identify sources of portfolio quality changes as early as possible;
  • Odds-to-score reports, which monitor shifts and rotations in probabilities of negative outcomes per score band; and
  • Time series of all key metrics reports, which detect overall trends in model stability, separation and accuracy.

Ultimately, these largely statistical analyses must also be connected to their business impacts. For example, shifts indicated in the population stability report may indicate declining credit quality, which may in turn call for operational adjustments to decision cut-offs, re-assessment of loss reserves or changes in staffing for case review.
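Among these reports, the population stability report typically rests on the population stability index (PSI), which compares the score distribution at development time against the current one, band by band. A minimal sketch follows; the band counts are hypothetical, and the interpretation thresholds in the docstring are a common rule of thumb rather than anything the regulations prescribe:

```python
import math

def psi(expected_counts, actual_counts):
    """Population stability index across score bands.
    Common rule of thumb (an assumption, not a regulatory standard):
    < 0.10 stable, 0.10-0.25 worth watching, > 0.25 significant shift."""
    e_total = sum(expected_counts)
    a_total = sum(actual_counts)
    index = 0.0
    for e, a in zip(expected_counts, actual_counts):
        e_pct = e / e_total   # expected share of this score band
        a_pct = a / a_total   # actual share of this score band
        index += (a_pct - e_pct) * math.log(a_pct / e_pct)
    return index

# Hypothetical score-band counts: development sample vs. this quarter
dev_counts = [100, 200, 400, 200, 100]
cur_counts = [150, 250, 350, 150, 100]
shift = psi(dev_counts, cur_counts)
```

Here the distribution has drifted toward lower score bands, producing a PSI of roughly 0.05 – a shift worth noting in the report but, under the rule of thumb above, not yet a trigger for rebuilding the model.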


Sixth, organizations need to be able to explain and defend their decision strategies. Strategies are the deterministic rule flows used to automate the business’s high-volume decisions, often expressed as decision tables, decision trees or rule sets calling on combinations of such decision objects. These strategies clearly illustrate the role that predictive models play within the decision. Such strategies can become very complex, making them difficult to summarize and compare – that is, to understand how different groups are treated from one strategy to another, and the basis of that variation.

Regulators will ask for empirical evidence to justify an organization’s decision strategies, and the more carefully these are documented up front, the easier it is later to demonstrate compliance. Most importantly, regulators will want to know how organizations balance increasing profits against containing risk, and what their realized gains, losses and exposures are as a result of their decision strategies. Automated solutions that can track, simplify and analyze these items and simulate any number of related scenarios that regulators might ask for are making the defense process much simpler today. They also bring additional value because they help banks review and re-evaluate the wisdom of their rules, as opposed to just comparing and contrasting them.
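To make the idea of a decision strategy concrete, here is a toy decision table in the spirit described above – a deterministic rule flow where rows are evaluated top to bottom and the first match wins. The cutoff values, field names and actions are invented for illustration only:

```python
# Toy decision table: each row is a condition set plus an action.
STRATEGY = [
    {"min_score": 700, "max_dti": 0.45, "action": "approve"},
    {"min_score": 640, "max_dti": 0.35, "action": "refer"},
]

def decide(score, dti):
    """Deterministic rule flow: return the action of the first
    matching row; decline if no row matches."""
    for row in STRATEGY:
        if score >= row["min_score"] and dti <= row["max_dti"]:
            return row["action"]
    return "decline"
```

Because the table is explicit data rather than scattered code, it can be versioned, diffed between strategies and handed to an auditor – which is precisely the kind of transparency and comparability the article calls for.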

The seventh step for organizations to ensure comprehensive, compliant model management is monitoring their overrides. As its name implies, an override occurs whenever the decision taken runs counter to the action recommended by the automated decision strategy. For example, if the decision rules determine that a loan application should have been declined due to a low credit score, but in fact a loan offer was extended, this is often called a low-side override. When overrides occur, regulators will require documentation and monitoring of those decisions. Establish clear and consistent guidelines for overrides as part of the organizational credit policy, as well as clear identifying codes for evaluating decisions. Then examine the reasons for high-side and low-side overrides to make sure they are being made according to that policy.
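Override monitoring of the kind described above reduces to comparing each recommended action against the action actually taken and tallying the exceptions. A minimal sketch, with invented decision data:

```python
from collections import Counter

def classify_override(recommended, taken):
    """Label a decision relative to the strategy's recommendation.
    Low-side: approved despite a recommended decline;
    high-side: declined despite a recommended approve."""
    if recommended == taken:
        return "none"
    if recommended == "decline" and taken == "approve":
        return "low-side"
    if recommended == "approve" and taken == "decline":
        return "high-side"
    return "other"

# Hypothetical (recommended, taken) pairs from a month of decisions
decisions = [
    ("approve", "approve"),
    ("decline", "approve"),   # low-side override
    ("approve", "decline"),   # high-side override
    ("decline", "approve"),   # low-side override
]
report = Counter(classify_override(r, t) for r, t in decisions)
```

In practice each override record would also carry the identifying reason code the credit policy mandates, so the resulting counts can be broken down by reason when regulators ask why overrides occurred.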

Finally, the eighth step is to thoroughly document the model development and model monitoring processes. Organizations should keep track of everything, building a detailed inventory of every model in their operating environment along with its purpose, usage, restrictions, inputs, performance, updates, owners within the organization and audit history. Such an inventory also needs to be easily searchable, so that it can produce any particular piece of information in a timely fashion to satisfy a request.
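The fields listed above map naturally onto a simple searchable inventory. The sketch below is illustrative only – the record fields mirror the article’s list, while the model names, owners and query interface are assumptions for the example:

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    name: str
    purpose: str
    owner: str
    inputs: list
    restrictions: str = ""
    audit_history: list = field(default_factory=list)

class ModelInventory:
    """A minimal searchable model inventory."""
    def __init__(self):
        self._models = {}

    def register(self, record):
        self._models[record.name] = record

    def search(self, **criteria):
        """Return models whose fields contain every given substring
        (case-insensitive), e.g. search(purpose="collections")."""
        return [m for m in self._models.values()
                if all(value.lower() in str(getattr(m, fld, "")).lower()
                       for fld, value in criteria.items())]

inv = ModelInventory()
inv.register(ModelRecord("card_origination_v3", "originations risk", "risk team",
                         ["bureau score", "dti"]))
inv.register(ModelRecord("collections_v1", "collections prioritization", "ops team",
                         ["delinquency days"]))
hits = inv.search(purpose="collections")
```

A production inventory would live in a governed database with access controls and audit trails, but the essential requirement is the same: any attribute of any model retrievable quickly enough to satisfy an auditor’s request.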

Ensuring compliance to new regulations, and doing what’s best for an organization’s stability and growth, requires stronger models that drive better decisions and improve business results while also automating the process as much as possible and easing the burden on already overstressed resources. Once an organization has the people, processes and technology in place for proper model management and validation, it will be in a better position to adapt to the constantly changing business environment – and thrive within it.

Andrew Jennings is chief analytics officer at FICO and head of FICO Labs. To read more commentary from Dr. Jennings and other FICO experts, visit
