Opaque algorithms, transparent biases: Automated content moderation during the Sheikh Jarrah Crisis

Authors: Norah Abokhodair, Yarden Skop, Sarah Rüller, Konstantin Aal, Houda Elmimouni
Journal: First Monday (Q2, Computer Science)
DOI: 10.5210/fm.v29i4.13620
Published: 2024-04-14 (Journal Article)
Cited by: 0

Abstract

Social media platforms, while influential tools for human rights activism, free speech, and mobilization, also bear the influence of corporate ownership and commercial interests. This dual character can lead to clashing interests in the operations of these platforms. This study centers on the May 2021 Sheikh Jarrah events in East Jerusalem, a focal point in the Israeli-Palestinian conflict that garnered global attention. During this period, Palestinian activists and their allies observed and encountered a notable increase in automated content moderation actions, such as shadow banning and content removal. We surveyed 201 users who faced content moderation and conducted 12 interviews with political influencers to assess the impact of these practices on activism. Our analysis centers on automated content moderation and transparency, investigating how users and activists perceive the content moderation systems employed by social media platforms and their opacity. Findings reveal perceived censorship among pro-Palestinian activists, driven by opaque and obfuscated technological mechanisms of content demotion that complicate the substantiation of harm and leave users without redress mechanisms. We view this difficulty as part of algorithmic harms in the realm of automated content moderation. This dynamic has far-reaching implications for the future of activism and raises questions about power centralization in digital spaces.
Source journal: First Monday (Computer Science — Computer Networks and Communications)
CiteScore: 2.20
Self-citation rate: 0.00%
Articles published per year: 86
About the journal: First Monday is one of the first openly accessible, peer-reviewed journals on the Internet, solely devoted to the Internet. Since its start in May 1996, First Monday has published 1,035 papers in 164 issues; these papers were written by 1,316 different authors. In addition, eight special issues have appeared. The most recent special issue was entitled A Web site with a view — The Third World on First Monday and it was edited by Eduardo Villanueva Mansilla. First Monday is indexed in Communication Abstracts, Computer & Communications Security Abstracts, DoIS, eGranary Digital Library, INSPEC, Information Science & Technology Abstracts, LISA, PAIS, and other services.