Against Human Content Moderation: Algorithms without Trauma

Impact Factor: 0.9 · CAS Region 2 (Philosophy) · JCR Q4 (Ethics)
Juan Espíndola
DOI: 10.1111/japp.70024
Journal: Journal of Applied Philosophy, Vol. 42, No. 4, pp. 1285–1300
Published: 2025-06-11 (Journal Article)
PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1111/japp.70024
Citations: 0

Abstract


This paper explores the morality of human content moderation. It focuses on content moderation of Child Sexual Abuse Material (CSAM) as it takes place in commercial digital platforms, broadly understood. I select CSAM for examination because there is a widespread and uncontroversial consensus around the need to remove it, which furnishes the strongest possible argument for human content moderation. The paper makes the case that, even if we grant that social media platforms or chatbots are a valuable—or inevitable— force in current societies, and even if moderation plays an important role in protecting users and society more generally from the detrimental effects of these digital tools, it is far from clear that tasking humans to conduct such moderation is permissible without constraints, given the psychic toll of the practice on moderators. While a blanket prohibition of human moderation would be objectionable, given the benefits of the practice, it behooves us to identify the fundamental interests affected by the harms of human moderation, the obligations that platforms acquire to protect such interests, and the conditions under which their realization is in fact possible. I argue that the failure to comply with certain standards renders human moderation impermissible.

Source journal: Journal of Applied Philosophy · CiteScore: 2.20 · Self-citation rate: 0.00% · Articles per year: 71