Navigating the gray areas of content moderation: Professional moderators’ perspectives on uncivil user comments and the role of (AI-based) technological tools
Authors: Andrea Stockinger, Svenja Schäfer, S. Lecheler
Journal: New Media & Society
Published: 2023-08-08 (Journal Article)
DOI: 10.1177/14614448231190901 (https://doi.org/10.1177/14614448231190901)
Source: Semantic Scholar
Citations: 0
Abstract
Professional content moderators are responsible for limiting the negative effects of online discussions on news platforms and social media. However, little is known about how they adjust to platform and company moderation strategies while viewing and dealing with uncivil comments. Using qualitative interviews (N = 18), this study examines which types of comments professional moderators classify as actionable, which (automated) strategies they use to moderate them, and how these perceptions and strategies differ between organizations, platforms, and individuals. Our results show that moderators divide content requiring intervention into clearly problematic and “gray area” comments. They (automatically) delete clear cases but use interactive or motivational moderation techniques for “gray areas.” While moderators crave more advanced technologies, they deem them incapable of addressing context-heavy comments. These findings highlight the need for nuanced regulations, emphasize the crucial role of moderators in shaping public discourse, and offer practical implications for (semi-)automated content moderation strategies.