Moderating online child sexual abuse material (CSAM): Does self-regulation work, or is greater state regulation needed?

Impact Factor: 2.0 · JCR Q1 (CRIMINOLOGY & PENOLOGY) · Region 3 (Sociology)
P. Bleakley, Elena Martellozzo, R. Spence, Jeffrey DeMarco
DOI: 10.1177/14773708231181361 · European Journal of Criminology · Published: 2023-07-02 (Journal Article)
Citations: 0

Abstract

Social media platforms are crucial public forums connecting users around the world through a decentralised cyberspace. These platforms host high volumes of content and, as such, employ content moderators (CMs) to safeguard users against harmful content like child sexual abuse material (CSAM). These roles are critical in the social media landscape; however, CMs' work as "digital first responders" is complicated by legal and systemic debates over whether the policing of cyberspace should be left to the self-regulation of technology companies, or whether greater state regulation is required. In this empirical policy and literature review, major debates in the area of content moderation and, in particular, the online policing of CSAM are identified and evaluated. These include the issue of territorial jurisdiction and how it obstructs traditional policing; concerns over free speech and privacy if CMs are given greater powers; and debates over whether technology companies should be legally liable for user-generated content (UGC). In outlining these issues, a more comprehensive foundation is established for evaluating current practices for monitoring and combatting online CSAM, one that illustrates both the practical and philosophical challenges of the existing status quo, wherein the state and private companies share these important responsibilities.
Source journal: European Journal of Criminology (CRIMINOLOGY & PENOLOGY)
CiteScore: 5.10
Self-citation rate: 5.30%
Articles per year: 28
Journal description: The European Journal of Criminology is a refereed journal published by SAGE Publications and the European Society of Criminology. It provides a forum for research and scholarship on crime and criminal justice institutions. The journal publishes high-quality articles using varied approaches, including discussion of theory, analysis of quantitative data, comparative studies, systematic evaluation of interventions, and studies of institutions and political processes. The journal also covers analysis of policy, but not description of policy developments. Priority is given to articles that are relevant to the wider Europe (within and beyond the EU), although findings may be drawn from other parts of the world.