Model Management: When Compliance Undercuts Performance
For many banks implementing model management practices, ensuring compliance and managing model effectiveness are often at odds. Case in point: An August 2014 article by Butler Analytics reports that “the modeling staff in one major US bank now spend 80% of their time meeting regulatory requirements, detracting from much needed new model development.”
While this case is extreme, every bank is seeing increased demand for the services of its analytics teams. An executive in the mortgage division of a banking client told us that, for a period of time, 25% of the division's analytics workforce had to be diverted to collecting, preparing and reporting on data required by regulators, costing the bank tens of millions of dollars in labor costs alone.
The good news is that some banks are moving ahead of the curve. They’re improving their ability to answer questions about analytics while lightening the burden on their analytics teams. In fact, a bank devoting 80% of modeler time to regulatory requirements could reduce that expenditure to 20% or less.
This more efficient, streamlined approach combines centralized model management with automated, configurable workflow tools. As depicted in the graphic below, workflows enforce approved processes at every stage of the model lifecycle. Built into the process is the capture of key details, decisions and approvals made throughout the model lifecycle.
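To make the workflow idea concrete, here is a minimal sketch of how configurable lifecycle enforcement might look in code. The stage names, transition rules, and `advance` helper are illustrative assumptions, not part of any specific product; the point is that only approved transitions succeed, and each step records who approved it.

```python
# Approved lifecycle stages and the transitions permitted between them.
# Any move not listed here is rejected outright.
ALLOWED_TRANSITIONS = {
    "development": {"validation"},
    "validation": {"approval", "development"},  # validation can send a model back for rework
    "approval": {"production"},
    "production": {"review"},
    "review": {"production", "development"},    # re-approve as-is, or rebuild
}

def advance(model, current, target, approver, audit_log):
    """Move a model to a new lifecycle stage, capturing the decision and approver."""
    if target not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"{model}: {current} -> {target} is not an approved transition")
    audit_log.append({"model": model, "from": current, "to": target, "approver": approver})
    return target

audit_log = []
stage = advance("mortgage-pd", "development", "validation", "j.smith", audit_log)
print(stage)  # -> validation
```

Because every stage change flows through one function, the audit log accumulates exactly the key details, decisions and approvals described above, with no extra effort from the modeling team.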
The central repository maintains an inventory of all models in operation and under development: their purpose, data types and sources, key assumptions, exclusions, predictive characteristics, segmentation schemes, where each model is being used, restrictions on usage, and so on. The model management solution also automates scheduled validations and flags models for review when stability or performance metrics decline. This makes it much easier for banks to keep models at peak performance, as well as to spot and resolve any compliance issues.
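A rough sketch of how such automated flagging could work, using a population stability index (PSI) as the degradation signal. The record fields mirror the inventory attributes listed above; the PSI threshold of 0.25 is a common rule of thumb for a major population shift, and all names here are hypothetical, not a reference to any particular vendor's schema.

```python
from dataclasses import dataclass

@dataclass
class ModelRecord:
    """One entry in the central model inventory."""
    name: str
    purpose: str
    data_sources: list
    key_assumptions: list
    usage_restrictions: list
    psi: float = 0.0  # population stability index from the latest scheduled validation

PSI_REVIEW_THRESHOLD = 0.25  # rule of thumb: PSI above 0.25 signals a major shift

def flag_for_review(inventory):
    """Return the names of models whose stability metric has degraded past the threshold."""
    return [m.name for m in inventory if m.psi > PSI_REVIEW_THRESHOLD]

inventory = [
    ModelRecord("mortgage-pd", "Long Run PD estimation", ["loan_book"],
                ["stable macro mix"], ["US mortgage only"], psi=0.31),
    ModelRecord("card-behavior", "Behavior scoring", ["txn_history"],
                ["12-month window"], [], psi=0.08),
]

print(flag_for_review(inventory))  # -> ['mortgage-pd']
```

In practice a scheduled job would recompute each model's metrics and run a check like this, so a drifting model surfaces for review without anyone having to remember to look.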
Through the central repository, interested parties across the organization have detailed information at hand to respond to internal or regulator queries, such as why a particular segmentation scheme was selected for a model. They can readily justify why a new predictive characteristic was added during a model refresh and how it was developed. They can provide regulators with evidence that a rigorous analysis of all potential economic drivers was performed for Long Run and Downturn PD estimation.
For more information on this approach to model management, I'll once again plug our latest Insights white paper, “Reducing Regulatory Drag on Analytics Teams” (No. 82).