The Fairness of Credit Scoring Models

Publication date
2021
Publication type
Journal Article
Summary
In credit markets, screening algorithms discriminate between good-type and bad-type borrowers. This is their raison d'être. However, in doing so, they often also discriminate between individuals sharing a protected attribute (e.g., gender, age, race) and the rest of the population. In this paper, we show how to test (1) whether there is a statistically significant difference in rejection rates or interest rates, called lack of fairness, between the protected and unprotected groups, and (2) whether this difference is driven solely by creditworthiness. When condition (2) is not met, the screening algorithm does not comply with the fair-lending principle and may be deemed illegal. Our framework provides guidance on how algorithmic fairness can be monitored by lenders, controlled by their regulators, and improved for the benefit of protected groups.
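As a rough illustration of the two-step test described in the summary, the sketch below runs (1) a two-proportion z-test on rejection rates across groups and (2) a logistic regression that controls for a creditworthiness proxy. This is not the authors' procedure: the dataset loans.csv, the columns rejected, protected, and credit_score, and the choice of test statistics are all assumptions.

```python
# Minimal sketch of the two fairness checks, under assumed data and tests;
# not the paper's actual methodology.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.proportion import proportions_ztest

df = pd.read_csv("loans.csv")  # hypothetical dataset with 0/1 columns

# (1) Lack of fairness: is the rejection rate significantly different
# between the protected group and the rest of the population?
counts = df.groupby("protected")["rejected"].sum()
nobs = df.groupby("protected")["rejected"].count()
z, p_unconditional = proportions_ztest(counts.values, nobs.values)
print(f"Unconditional rejection-rate gap: z={z:.2f}, p={p_unconditional:.4f}")

# (2) Is the gap explained by creditworthiness alone? Regress the
# rejection decision on the protected attribute while controlling for a
# creditworthiness proxy; a coefficient on the protected attribute that
# stays significant suggests the gap is not driven by creditworthiness.
X = sm.add_constant(df[["protected", "credit_score"]])
logit = sm.Logit(df["rejected"], X).fit(disp=0)
p_conditional = logit.pvalues["protected"]
print(f"Conditional p-value on protected attribute: {p_conditional:.4f}")
```

A rejection of (1) alone signals a lack of fairness; a rejection of (2) as well is what the summary flags as non-compliance with the fair-lending principle.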
Publisher
Elsevier BV
Topics of the publication
  • No themes identified
Themes detected by scanR from retrieved publications. For more information, see https://scanr.enseignementsup-recherche.gouv.fr