End-to-End Bias Mitigation in Candidate Recommender Systems with Fairness Gates

Adam Mehdi Arafan*, David Graus, Fernando P. Santos, Emma Beauxis-Aussalet

*Corresponding author for this work

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Recommender Systems (RS) have proven successful in a wide variety of domains, and the human resources (HR) domain is no exception. RS have proved valuable for recommending candidates for a position, although such applications have recently been identified as high-risk by the European Commission. In this study, we apply RS to match candidates with job requests. The RS pipeline includes fairness gates at two different steps: pre-processing (using GAN-based synthetic candidate generation) and post-processing (using greedily searched candidate re-ranking). While prior research has studied fairness at the pre- and post-processing steps separately, our approach combines both in a single pipeline applicable to the HR domain. We show that combining gender-balanced synthetic training data with pair re-ranking increases fairness while preserving satisfactory ranking utility. Our findings show that using only the gender-balanced synthetic data for bias mitigation is fairer than using real data by a negligible margin. However, when implemented together with the pair re-ranker, candidate recommendation fairness improved considerably while maintaining a satisfactory utility score. In contrast, using only the pair re-ranker achieved a similar fairness level but had consistently lower utility.
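The post-processing gate can be illustrated with a minimal sketch of a greedy, fairness-aware re-ranker: at each rank it prefers the highest-scoring candidate from a currently under-represented group, falling back to the overall best candidate otherwise. This is an illustrative assumption, not the authors' implementation; the names (Candidate, greedy_rerank, target_share) and the demographic-parity target over a binary gender attribute are hypothetical choices for the example.

    # Illustrative sketch only: a greedy re-ranker for a post-processing
    # fairness gate. Not the authors' implementation; names and the
    # parity criterion are assumptions made for this example.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        cid: str
        score: float   # relevance score from the recommender
        group: str     # protected attribute, e.g. "F" or "M"

    def greedy_rerank(candidates, k, target_share=0.5):
        """Greedily pick k candidates, preferring the best-scoring one
        whose group falls below target_share in the ranking built so far."""
        pool = sorted(candidates, key=lambda c: c.score, reverse=True)
        ranked, counts = [], {}
        while pool and len(ranked) < k:
            total = len(ranked)
            # Groups whose current share of the ranking is below the target.
            under = {g for g in {c.group for c in pool}
                     if total == 0 or counts.get(g, 0) / total < target_share}
            # Best candidate from an under-represented group, else overall best.
            pick = next((c for c in pool if c.group in under), pool[0])
            pool.remove(pick)
            ranked.append(pick)
            counts[pick.group] = counts.get(pick.group, 0) + 1
        return ranked

    if __name__ == "__main__":
        cands = [Candidate("a", 0.9, "M"), Candidate("b", 0.85, "M"),
                 Candidate("c", 0.8, "F"), Candidate("d", 0.7, "F"),
                 Candidate("e", 0.6, "M")]
        for c in greedy_rerank(cands, k=4):
            print(c.cid, c.group, round(c.score, 2))

On the example data this interleaves groups (a, c, b, d) rather than returning the top-4 by score alone, which conveys the utility-fairness trade-off the abstract describes.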

Original language: English
Title of host publication: RecSys-in-HR 2022: Recommender Systems for Human Resources 2022
Subtitle of host publication: Proceedings of the 2nd Workshop on Recommender Systems for Human Resources (RecSys-in-HR 2022), co-located with the 16th ACM Conference on Recommender Systems (RecSys 2022), Seattle, USA, 18th-23rd September 2022
Editors: Mesut Kaya, Toine Bogers, David Graus, Sepideh Mesbah, Chris Johnson, Francisco Gutiérrez
Publisher: CEUR-WS
Pages: 1-8
Number of pages: 8
Publication status: Published - 2022
Event: 2nd Workshop on Recommender Systems for Human Resources, RecSys-in-HR 2022 - Seattle, United States
Duration: 18 Sept 2022 - 23 Sept 2022

Publication series

Name: CEUR Workshop Proceedings
Publisher: CEUR Workshop Proceedings
Volume: 3218
ISSN (Print): 1613-0073

Conference

Conference: 2nd Workshop on Recommender Systems for Human Resources, RecSys-in-HR 2022
Country/Territory: United States
City: Seattle
Period: 18/09/22 - 23/09/22

Bibliographical note

Funding Information:
We acknowledge the University of Amsterdam - Master programme Information Studies for creating the conditions to perform this research and for financially supporting this publication.

Publisher Copyright:
© 2022 Copyright for this paper by its authors.

Keywords

  • Fair Artificial Intelligence
  • Generative Modelling
  • Information Retrieval
  • Recommender Systems
