S. Radovanović, A. Petrović, Zorica Dodevska, Boris Delibasic
{"title":"FairAW -无歧视的加性加权","authors":"S. Radovanović, A. Petrović, Zorica Dodevska, Boris Delibasic","doi":"10.3233/ida-226898","DOIUrl":null,"url":null,"abstract":"With growing awareness of the societal impact of decision-making, fairness has become an important issue. More specifically, in many real-world situations, decision-makers can unintentionally discriminate a certain group of individuals based on either inherited or appropriated attributes, such as gender, age, race, or religion. In this paper, we introduce a post-processing technique, called fair additive weighting (FairAW) for achieving group and individual fairness in multi-criteria decision-making methods. The methodology is based on changing the score of an alternative by imposing fair criteria weights. This is achieved through minimization of differences in scores of individuals subject to fairness constraint. The proposed methodology can be successfully used in multi-criteria decision-making methods where the additive weighting is used to evaluate scores of individuals. Moreover, we tested the method both on synthetic and real-world data, and compared it to Disparate Impact Remover and FA*IR methods that are commonly used in achieving fair scoring of individuals. The obtained results showed that FairAW manages to achieve group fairness in terms of statistical parity, while also retaining individual fairness. Additionally, our approach managed to obtain the best equality in scoring between discriminated and privileged groups.","PeriodicalId":50355,"journal":{"name":"Intelligent Data Analysis","volume":"24 1","pages":"1023-1045"},"PeriodicalIF":0.9000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FairAW - Additive weighting without discrimination\",\"authors\":\"S. Radovanović, A. Petrović, Zorica Dodevska, Boris Delibasic\",\"doi\":\"10.3233/ida-226898\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With growing awareness of the societal impact of decision-making, fairness has become an important issue. More specifically, in many real-world situations, decision-makers can unintentionally discriminate a certain group of individuals based on either inherited or appropriated attributes, such as gender, age, race, or religion. In this paper, we introduce a post-processing technique, called fair additive weighting (FairAW) for achieving group and individual fairness in multi-criteria decision-making methods. The methodology is based on changing the score of an alternative by imposing fair criteria weights. This is achieved through minimization of differences in scores of individuals subject to fairness constraint. The proposed methodology can be successfully used in multi-criteria decision-making methods where the additive weighting is used to evaluate scores of individuals. Moreover, we tested the method both on synthetic and real-world data, and compared it to Disparate Impact Remover and FA*IR methods that are commonly used in achieving fair scoring of individuals. The obtained results showed that FairAW manages to achieve group fairness in terms of statistical parity, while also retaining individual fairness. 
Additionally, our approach managed to obtain the best equality in scoring between discriminated and privileged groups.\",\"PeriodicalId\":50355,\"journal\":{\"name\":\"Intelligent Data Analysis\",\"volume\":\"24 1\",\"pages\":\"1023-1045\"},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2023-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intelligent Data Analysis\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.3233/ida-226898\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligent Data Analysis","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.3233/ida-226898","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
FairAW - Additive weighting without discrimination
With growing awareness of the societal impact of decision-making, fairness has become an important issue. In many real-world situations, decision-makers can unintentionally discriminate against a certain group of individuals on the basis of inherited or acquired attributes such as gender, age, race, or religion. In this paper, we introduce a post-processing technique, called fair additive weighting (FairAW), for achieving group and individual fairness in multi-criteria decision-making methods. The methodology changes the score of an alternative by imposing fair criteria weights, which is achieved by minimizing the differences in individuals' scores subject to a fairness constraint. The proposed methodology can be used in any multi-criteria decision-making method in which additive weighting is used to evaluate the scores of individuals. We tested the method on both synthetic and real-world data and compared it to the Disparate Impact Remover and FA*IR methods, which are commonly used to achieve fair scoring of individuals. The results show that FairAW achieves group fairness in terms of statistical parity while also retaining individual fairness. Additionally, our approach obtained the best equality in scoring between the discriminated and privileged groups.
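The abstract describes FairAW only at a high level; the exact optimization model is given in the paper. As a rough, hedged sketch of the general idea, the Python snippet below adjusts the criteria weights of an additive-weighting (weighted-sum) model so that the mean scores of the protected and privileged groups coincide (statistical parity), while keeping the adjusted scores as close as possible to the original ones. The decision matrix, group labels, original weights, and the use of SciPy's constrained solver are all illustrative assumptions, not the authors' formulation.

```python
# Illustrative sketch, not the paper's exact FairAW model: post-process
# additive-weighting scores by re-weighting criteria under a parity constraint.
import numpy as np
from scipy.optimize import minimize

# Toy decision matrix: 8 individuals (alternatives) x 3 criteria.
# First four rows: privileged group; last four rows: protected group.
X = np.array([
    [0.9, 0.2, 0.3],
    [0.8, 0.3, 0.4],
    [0.7, 0.1, 0.5],
    [0.6, 0.2, 0.6],
    [0.3, 0.9, 0.4],
    [0.2, 0.8, 0.5],
    [0.4, 0.7, 0.3],
    [0.1, 0.9, 0.6],
])
protected = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 1 = protected group

w_orig = np.array([0.5, 0.3, 0.2])   # original criteria weights
scores_orig = X @ w_orig             # additive-weighting scores

def parity_gap(w):
    # Statistical parity: difference in mean scores between the groups.
    s = X @ w
    return s[protected == 1].mean() - s[protected == 0].mean()

def objective(w):
    # Keep adjusted scores close to the originals (a proxy for preserving
    # individual fairness and ranking utility).
    return np.sum((X @ w - scores_orig) ** 2)

constraints = [
    {"type": "eq", "fun": parity_gap},               # equal group means
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},  # weights stay normalized
]
bounds = [(0.0, 1.0)] * X.shape[1]

res = minimize(objective, x0=w_orig, bounds=bounds, constraints=constraints)
w_fair = res.x
print("fair weights:", np.round(w_fair, 3))
print("parity gap: before =", round(parity_gap(w_orig), 4),
      "after =", round(parity_gap(w_fair), 4))
```

This only illustrates the weighted-sum scoring step and a statistical-parity constraint; the paper's evaluation against Disparate Impact Remover and FA*IR, and its treatment of individual fairness, go beyond this sketch.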
Journal introduction:
Intelligent Data Analysis provides a forum for examining issues related to the research and application of Artificial Intelligence techniques in data analysis across a variety of disciplines. These techniques include (but are not limited to): all areas of data visualization, data pre-processing (fusion, editing, transformation, filtering, sampling), data engineering, database mining techniques, tools and applications, use of domain knowledge in data analysis, big data applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing. Papers that discuss the development of new AI-related data analysis architectures, methodologies, and techniques, and their applications to various domains, are particularly preferred.