Fairness of Machine Learning Algorithms for the Black Community
S. M. A. Kiemde, A. Kora
2020 IEEE International Symposium on Technology and Society (ISTAS)
Published 2020-11-12 · DOI: 10.1109/ISTAS50296.2020.9462194
This paper studies the limits of existing definitions of algorithmic fairness with respect to a protected variable, namely skin color. Discrimination against the Black community has existed for a long time; machine learning algorithms have only amplified or revealed this existing discrimination. AI is a mirror that reflects the reality of our societies. The lack of a universal definition of algorithmic fairness makes it difficult to detect cases of discrimination in machine learning algorithms. We argue that sensitive variables such as skin color are benchmarks that can be used to decide whether or not a decision is fair. We also recommend avoiding the use of proxy data.