While the uses of AI continue to expand, the public’s trust in AI has stalled. People’s faith in AI to make the right decisions for everyone is being dented by cases of bias in AI, including gender bias. That’s not surprising, given that a recent study by PwC found that only 25% of companies say they prioritize considering the ethical implications of an AI solution before investing in it.
This is an important issue for society, for data scientists and for me personally. I have built my career on analytics, increasing their sophistication and widening their use. I know from experience that analytics can be a force for good, removing bias in lending decisions and other areas.
Fairness has always been a focus of my team’s work on the FICO® Score and related solutions. Since the 1960s, when FICO first introduced credit scoring, the company has worked with lenders, consumer groups and regulators to make sure our analytics are designed to treat people fairly. For example, we work diligently to ensure that every FICO® Score model we create is designed to produce scores that are not only fair and accurate, but that also empower consumers to understand their credit. That commitment runs through every stage of our model development process, from identifying appropriate data to creating variables and forming reason codes that consumers can clearly understand.
Our experience, integrity and strong partnerships with key stakeholders give us an unparalleled ability to focus on fairness the way we do. As our chief analytics officer Scott Zoldi has written, there are many ways to inadvertently introduce bias into a model, and the risks increase with AI and machine learning. This is why there is so much focus on explainable and responsible AI, and FICO is one of the leaders in this movement.
The women at FICO have something important to say about this. For International Women’s Day 2020, I participated in this video showcasing some of our female leaders around the world discussing the importance of diversity in analytics.
I’m proud to work alongside these women — and thousands more at FICO — to ensure our solutions are designed to be fair as well as powerful. That’s how we do AI right.
As we say in the video, in a world where gender bias in AI is a serious issue, we at FICO take our role seriously. Very seriously.