Gender Bias in AI Recruitment Systems: A Sociological- and Data Science-based Case Study
Sheilla Njoto, M. Cheong, Reeva M. Lederman, A. McLoughney, L. Ruppanner, Anthony Wirth
2022 IEEE International Symposium on Technology and Society (ISTAS), 10 November 2022
DOI: 10.1109/ISTAS55053.2022.10227106
Citations: 1
Abstract
This paper explores the extent to which gender bias is introduced when automation is deployed in hiring practices. We use an interdisciplinary methodology to test our hypotheses: observing a human-led recruitment panel and building an explainable algorithmic prototype from the ground up to quantify gender bias. The key findings of this study are threefold: identifying potential sources of human bias in a recruitment panel’s ranking of CVs; identifying sources of bias in a prospective algorithmic pipeline that simulates human decision making; and recommending ways to mitigate bias on both fronts. Our study provides an innovative research design that combines social science and data science to theorise how automation may introduce bias into hiring practices, and to pinpoint where that bias is introduced. It also furthers the current scholarship on gender bias in hiring by providing key empirical inferences about the factors that contribute to bias.
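The abstract does not specify how bias is quantified from the panel's CV rankings, so the following is a minimal illustrative sketch only, not the authors' pipeline. It assumes one simple disparity measure (a mean-rank gap between female- and male-labelled CVs); all identifiers and data are invented for demonstration.

    # Hypothetical sketch: quantify gender disparity in a ranked list of CVs.
    # The metric and the example data are assumptions, not the paper's method.
    from statistics import mean

    def mean_rank_gap(ranking, genders):
        """Mean rank of female-labelled CVs minus mean rank of male-labelled CVs.

        ranking: list of CV identifiers, best candidate first (rank 1).
        genders: dict mapping CV identifier -> "F" or "M".
        A positive gap means female CVs sit lower (worse) in the ranking on average.
        """
        ranks = {cv: i + 1 for i, cv in enumerate(ranking)}
        female = [ranks[cv] for cv in ranking if genders[cv] == "F"]
        male = [ranks[cv] for cv in ranking if genders[cv] == "M"]
        return mean(female) - mean(male)

    if __name__ == "__main__":
        # Invented panel ranking of six CVs.
        ranking = ["cv1", "cv2", "cv3", "cv4", "cv5", "cv6"]
        genders = {"cv1": "M", "cv2": "F", "cv3": "M",
                   "cv4": "F", "cv5": "M", "cv6": "F"}
        print(f"Mean rank gap (F - M): {mean_rank_gap(ranking, genders):+.2f}")

Under this sketch, a gap near zero would suggest the ranking treats both groups similarly, while a large positive value would flag female CVs being systematically ranked lower; the paper's own prototype would apply richer, explainable measures than this single statistic.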