Dexer: Detecting and Explaining Biased Representation in Ranking
Y. Moskovitch, Jinyang Li, H. Jagadish
DOI: 10.1145/3555041.3589725
Companion of the 2023 International Conference on Management of Data, published 2023-06-04
Citations: 0
Abstract
With the growing use of ranking algorithms for real-life decision-making, fairness in ranking has been recognized as an important issue. Recent works have studied different fairness measures in ranking, many of which consider the representation of different "protected groups" among the top-k ranked items, for any reasonable k. Given the protected groups, confirming algorithmic fairness is a simple task. However, the groups' definitions may not be known in advance. To this end, we present Dexer, a system for detecting groups with biased representation in the top-k. Dexer utilizes the notion of Shapley values to provide users with visual explanations of the cause of bias. We will demonstrate the usefulness of Dexer using real-life data.
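To make the representation measure concrete, the following is a minimal sketch (not Dexer's actual implementation, whose detection algorithm and Shapley-based explanations are described in the paper) of how a group's representation in the top-k can be compared against its share of the full candidate pool, flagging the group when the ratio falls below a chosen threshold:

```python
def representation_ratio(ranked, in_group, k):
    """Share of group members in the top-k, divided by their share overall.

    `ranked` is a list of items ordered best-first; `in_group` is a
    predicate marking group membership (e.g. record["gender"] == "F").
    """
    top_share = sum(map(in_group, ranked[:k])) / k
    overall_share = sum(map(in_group, ranked)) / len(ranked)
    return top_share / overall_share if overall_share else float("nan")


def is_underrepresented(ranked, in_group, k, threshold=0.8):
    # A disparate-impact-style test: representation in the top-k below
    # `threshold` times the group's overall share counts as biased.
    # The 0.8 default is an illustrative choice, not taken from the paper.
    return representation_ratio(ranked, in_group, k) < threshold
```

For example, in a ranking of ten candidates where half are women but none appear in the top four, `representation_ratio` returns 0 and the group is flagged. The hard part, which Dexer addresses, is that the biased group may be defined by an unknown combination of attribute values, so every reasonable group definition (and every reasonable k) must be searched.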