Emergent Discrimination: Should We Protect Algorithmic Groups?

IF 0.9 · CAS Tier 2 (Philosophy) · JCR Q4 (ETHICS)
Jannik Zeiser
{"title":"紧急歧视:我们应该保护算法群体吗?","authors":"Jannik Zeiser","doi":"10.1111/japp.12793","DOIUrl":null,"url":null,"abstract":"<p>Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these <i>algorithmic groups</i>, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is <i>systematically</i> disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should <i>prevent</i> algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.</p>","PeriodicalId":47057,"journal":{"name":"Journal of Applied Philosophy","volume":"42 3","pages":"910-928"},"PeriodicalIF":0.9000,"publicationDate":"2025-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/japp.12793","citationCount":"0","resultStr":"{\"title\":\"Emergent Discrimination: Should We Protect Algorithmic Groups?\",\"authors\":\"Jannik Zeiser\",\"doi\":\"10.1111/japp.12793\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these <i>algorithmic groups</i>, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is <i>systematically</i> disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. 
Instead, we should <i>prevent</i> algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.</p>\",\"PeriodicalId\":47057,\"journal\":{\"name\":\"Journal of Applied Philosophy\",\"volume\":\"42 3\",\"pages\":\"910-928\"},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2025-02-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1111/japp.12793\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Applied Philosophy\",\"FirstCategoryId\":\"98\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1111/japp.12793\",\"RegionNum\":2,\"RegionCategory\":\"哲学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ETHICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Applied Philosophy","FirstCategoryId":"98","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/japp.12793","RegionNum":2,"RegionCategory":"哲学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ETHICS","Score":null,"Total":0}
Citations: 0

Abstract

Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these algorithmic groups, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is systematically disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should prevent algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.
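
To make the abstract's central example concrete, here is a minimal sketch (purely illustrative, not from the paper) of how one might test whether a model's decisions disproportionately disadvantage an arbitrarily defined 'algorithmic group' such as people born on a Tuesday. The simulated data, the 0.05 score penalty, and the four-fifths flag threshold are all assumptions made for illustration.

```python
# Hypothetical sketch (not the author's method): checking whether a model's
# decisions disadvantage an arbitrary, socially non-salient "algorithmic group".
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy population: a scoring model's inputs plus a birth-weekday field.
born_tuesday = rng.integers(0, 7, n) == 1      # arbitrary, non-salient trait
score = rng.normal(0.5, 0.15, n)

# Suppose the model has (perhaps emergently) learned a small penalty for this
# group; in practice such correlations can arise from proxies in the data.
score -= 0.05 * born_tuesday
approved = score > 0.5

# Disparate-impact style check: compare approval rates across the group split.
rate_in = approved[born_tuesday].mean()
rate_out = approved[~born_tuesday].mean()
print(f"approval rate, born on Tuesday: {rate_in:.3f}")
print(f"approval rate, everyone else:   {rate_out:.3f}")
print(f"impact ratio: {rate_in / rate_out:.2f}")  # ratios below ~0.8 are often flagged
```

The sketch also illustrates the article's practical point: running such a check presupposes knowing which group to test, and for emergent algorithmic groups that is precisely the information we currently lack.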

Source journal metrics
CiteScore: 2.20
Self-citation rate: 0.00%
Articles published per year: 71