Practical bias correction in neural networks: a credit default prediction case study

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Artificial intelligence (AI) is increasingly being used for decision-making. Technological developments have significantly increased the performance of AI models but have also increased their complexity. As a result, IT professionals are struggling to develop fair AI implementations. Using a case study, we demonstrate how both gender and age bias can be addressed in practice. We do this by developing a credit default prediction model and detecting and mitigating both age and gender bias within this model. A neural network was trained on a real-world credit data set from Taiwan. Existing 'bias' in the data set and the bias introduced by this initial model were measured using a combination of previously published methods. A corrected model was created by training and evaluating a series of models to control bias along multiple dimensions. The final model eliminates the measured bias without sacrificing accuracy. It uses a top-down post-processing technique focused on an equal increase of the default rate per group.
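To illustrate the kind of post-processing the abstract refers to, the sketch below shows one way a per-group "equal increase of the default rate" correction could be implemented. This is an assumption-laden illustration, not the authors' published procedure: the function names, the single `delta` parameter, and the quantile-based threshold choice are all hypothetical, and the toy data is random.

```python
# Hypothetical sketch (not the paper's exact method): post-process a trained
# model's scores with per-group decision thresholds so that the share of cases
# flagged as "default" in every group rises by the same fixed amount (delta)
# above that group's observed default rate.
import numpy as np

def group_thresholds(scores, groups, y_true, delta=0.02):
    """Return one score threshold per group so that the flagged-default rate
    in each group equals that group's observed default rate plus delta."""
    thresholds = {}
    for g in np.unique(groups):
        mask = groups == g
        target_rate = y_true[mask].mean() + delta      # equal increase per group
        target_rate = min(max(target_rate, 0.0), 1.0)
        # the (1 - target_rate) quantile of the group's scores flags exactly
        # target_rate of that group as defaults
        thresholds[g] = np.quantile(scores[mask], 1.0 - target_rate)
    return thresholds

def apply_thresholds(scores, groups, thresholds):
    """Predict 'default' when a case's score exceeds its own group's threshold."""
    return np.array([scores[i] > thresholds[g] for i, g in enumerate(groups)])

# Usage with toy data (random scores and labels, for illustration only)
rng = np.random.default_rng(0)
scores = rng.uniform(size=1000)                 # model output, e.g. P(default)
groups = rng.choice(["group_a", "group_b"], 1000)   # protected attribute
y_true = (rng.uniform(size=1000) < 0.22).astype(int)

th = group_thresholds(scores, groups, y_true)
y_hat = apply_thresholds(scores, groups, th)
```

Because the correction only moves each group's decision threshold, it leaves the trained network untouched, which is what makes this style of top-down post-processing attractive when retraining is costly.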
Original language: English
Article number: 3
Pages (from-to): 1
Number of pages: 12
Journal: Computers and Society Research Journal
Volume: 2022
Issue number: 3
DOIs
Publication status: Published - 1 May 2022
