CJEU Ruling Highlights the Need for Explainability in Analytics

Black box models will be even more difficult to justify after a recent European ruling - explainable AI and reason codes with credit scores are critical

On December 7th, 2023, the Court of Justice of the European Union (CJEU, Case C-634/21) ruled that credit decisions primarily relying on a credit bureau-provided score value are in violation of the General Data Protection Regulation (GDPR). While the specific case relates to Germany’s leading credit bureau, the decision has pan-European relevance, as emphasized by the fact that the governments of Germany, Denmark, Portugal and Finland, as well as the European Commission, submitted opinions to the court.

The CJEU highlights that automated decisions with material impact on the “data subject” (i.e., the customer) require the use of “appropriate mathematical or statistical procedures” and “technical and organisational measures appropriate to ensure that the risk of errors is minimised and inaccuracies are corrected.” The court also stresses the customer’s right “to obtain human intervention […] to express his or her point of view and to challenge the decision taken in his or her regard.”


A related decision was made by the Berlin Data Protection Authority earlier this year. It fined a major German direct bank for being unable to explain why certain credit card applications were declined. The Data Protection Authority highlighted that the affected customers are entitled to an explanation of the reasons for an adverse decision and of the logic behind the automated decision.

In both this decision and the CJEU ruling, explainability and interpretability are at issue. The regulators are taking a stand against the black box.

Where automated decisions have a material impact on customers, like decisions to grant credit, the need for explainability and interpretability is deeply embedded in GDPR and the underlying desire to protect customers from biased, unfair or erratic decisions. The CJEU decision affects organisations that are prioritising the perceived protection of their trade secrets (the black box approach) over the transparency, explainability, interpretability and auditability of automated decisions. 

At FICO, we have prioritised explainability and interpretability for many years. Our Chief Analytics Officer, Scott Zoldi, has made explainable AI (XAI) his mission. Our software is designed to support transparency and interpretable decisions, from analytic model development (including machine learning models, neural networks and other model types) to model execution. With traditional scorecards, we have long emphasised model explainability and the provision of reason codes at execution time, in support of interpretable credit decisions. With our explainable AI toolkit (xAI) we bring transparency to machine learning models and individual predictions across tree ensemble models and neural networks. When operationalising decision strategies, we embed decision explainability in the execution layer of both predictive models and business logic. This allows businesses to understand, explain and justify automated decisions, and to identify where models or business rules need to be improved or corrected.
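To make the idea of reason codes concrete, here is a minimal sketch of one common approach: rank each scorecard characteristic by how many points the applicant lost versus the best attainable bin, and report the largest shortfalls as adverse-action reasons. The scorecard, bins, point values and reason texts below are entirely hypothetical, invented for illustration; they are not FICO's actual models or methodology.

```python
# Toy scorecard: hypothetical characteristics, attribute bins and points.
SCORECARD = {
    "payment_history": {"no_late_payments": 60, "one_late_payment": 35,
                        "several_late_payments": 10},
    "utilisation": {"low": 50, "medium": 30, "high": 5},
    "credit_age_years": {"10+": 40, "3-10": 25, "<3": 10},
}

# Hypothetical customer-facing reason texts per characteristic.
REASON_TEXT = {
    "payment_history": "History of late or missed payments",
    "utilisation": "High balances relative to credit limits",
    "credit_age_years": "Short credit history",
}

def score_with_reasons(applicant, top_n=2):
    """Score an applicant and return reason codes via the
    'points below maximum' method."""
    total = 0
    shortfalls = []
    for characteristic, bins in SCORECARD.items():
        points = bins[applicant[characteristic]]
        total += points
        lost = max(bins.values()) - points  # points below best bin
        if lost > 0:
            shortfalls.append((lost, characteristic))
    # The biggest point shortfalls become the adverse-action reasons.
    shortfalls.sort(reverse=True)
    reasons = [REASON_TEXT[c] for _, c in shortfalls[:top_n]]
    return total, reasons

score, reasons = score_with_reasons({
    "payment_history": "one_late_payment",
    "utilisation": "high",
    "credit_age_years": "3-10",
})
# High utilisation costs this applicant the most points, so it
# surfaces as the first reason alongside the score itself.
```

The key design point is that the explanation is computed at execution time from the same model that produced the score, so the score and its reasons can never drift apart.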

In North America, where FICO Scores are the gold standard for consumer credit risk ratings, FICO has put substantial effort into explaining each individual score value with reason codes, as well as educating the public about the underlying concepts and principles. While the term explainable AI may be relatively new, we have been making credit scores explainable for decades.

The age of black box AI is over. Explainability has always been, and continues to be, best practice. Credit decisions, credit scores and analytic models deserve transparency. Your customers, even if declined, deserve an explanation. Inappropriate and unfair decisions require correction. Explainable models and transparent decisions help you to understand your decisions better, to avoid bias and errors, and ultimately to make better decisions. They are worth your effort.

How FICO Helps You Develop Explainable Analytics

 
