{"title":"\"总有办法绕过准则\":TikTok 上的非自杀性自残与内容审核","authors":"Valerie Lookingbill, Kimanh Le","doi":"10.1177/20563051241254371","DOIUrl":null,"url":null,"abstract":"The stigmatized nature of nonsuicidal self-injury may render TikTok, a short-form, video-sharing social media platform, appealing to individuals who engage in this behavior. Since this community faces biased scrutiny based on stigmatization surrounding mental health, nonsuicidal self-injury users may turn to TikTok, which offers a space for users to engage in discussions of nonsuicidal self-injury, exchange social support, experience validation with little fear of stigmatization, and facilitate harm reduction strategies. While TikTok’s Community Guidelines permit users to share personal experiences with mental health topics, TikTok explicitly bans content that shows, promotes, or shares plans for self-harm. As such, TikTok may moderate user-generated content, leading to exclusion and marginalization in this digital space. Through semi-structured interviews with 8 TikTok users and a content analysis of 150 TikTok videos, we explore how users with a history of nonsuicidal self-injury experience TikTok’s algorithm to engage with content on nonsuicidal self-injury. Findings demonstrate that users understand how to circumnavigate TikTok’s algorithm through hashtags, signaling, and algospeak to maintain visibility while also circumnavigating algorithmic detection on the platform. Furthermore, findings emphasize that users actively engage in self-surveillance, self-censorship, and self-policing to create a safe online community of care. Content moderation, however, can ultimately hinder progress toward the destigmatization of nonsuicidal self-injury.","PeriodicalId":47920,"journal":{"name":"Social Media + Society","volume":"73 1","pages":""},"PeriodicalIF":5.5000,"publicationDate":"2024-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"“There’s Always a Way to Get Around the Guidelines”: Nonsuicidal Self-Injury and Content Moderation on TikTok\",\"authors\":\"Valerie Lookingbill, Kimanh Le\",\"doi\":\"10.1177/20563051241254371\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The stigmatized nature of nonsuicidal self-injury may render TikTok, a short-form, video-sharing social media platform, appealing to individuals who engage in this behavior. Since this community faces biased scrutiny based on stigmatization surrounding mental health, nonsuicidal self-injury users may turn to TikTok, which offers a space for users to engage in discussions of nonsuicidal self-injury, exchange social support, experience validation with little fear of stigmatization, and facilitate harm reduction strategies. While TikTok’s Community Guidelines permit users to share personal experiences with mental health topics, TikTok explicitly bans content that shows, promotes, or shares plans for self-harm. As such, TikTok may moderate user-generated content, leading to exclusion and marginalization in this digital space. Through semi-structured interviews with 8 TikTok users and a content analysis of 150 TikTok videos, we explore how users with a history of nonsuicidal self-injury experience TikTok’s algorithm to engage with content on nonsuicidal self-injury. Findings demonstrate that users understand how to circumnavigate TikTok’s algorithm through hashtags, signaling, and algospeak to maintain visibility while also circumnavigating algorithmic detection on the platform. 
Furthermore, findings emphasize that users actively engage in self-surveillance, self-censorship, and self-policing to create a safe online community of care. Content moderation, however, can ultimately hinder progress toward the destigmatization of nonsuicidal self-injury.\",\"PeriodicalId\":47920,\"journal\":{\"name\":\"Social Media + Society\",\"volume\":\"73 1\",\"pages\":\"\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-05-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Social Media + Society\",\"FirstCategoryId\":\"98\",\"ListUrlMain\":\"https://doi.org/10.1177/20563051241254371\",\"RegionNum\":1,\"RegionCategory\":\"文学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMMUNICATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Social Media + Society","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1177/20563051241254371","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMMUNICATION","Score":null,"Total":0}
“There’s Always a Way to Get Around the Guidelines”: Nonsuicidal Self-Injury and Content Moderation on TikTok
The stigmatized nature of nonsuicidal self-injury may render TikTok, a short-form, video-sharing social media platform, appealing to individuals who engage in this behavior. Since this community faces biased scrutiny based on stigmatization surrounding mental health, nonsuicidal self-injury users may turn to TikTok, which offers a space for users to engage in discussions of nonsuicidal self-injury, exchange social support, experience validation with little fear of stigmatization, and facilitate harm reduction strategies. While TikTok’s Community Guidelines permit users to share personal experiences with mental health topics, TikTok explicitly bans content that shows, promotes, or shares plans for self-harm. As such, TikTok may moderate user-generated content, leading to exclusion and marginalization in this digital space. Through semi-structured interviews with 8 TikTok users and a content analysis of 150 TikTok videos, we explore how users with a history of nonsuicidal self-injury experience TikTok’s algorithm to engage with content on nonsuicidal self-injury. Findings demonstrate that users understand how to circumnavigate TikTok’s algorithm through hashtags, signaling, and algospeak to maintain visibility while also circumnavigating algorithmic detection on the platform. Furthermore, findings emphasize that users actively engage in self-surveillance, self-censorship, and self-policing to create a safe online community of care. Content moderation, however, can ultimately hinder progress toward the destigmatization of nonsuicidal self-injury.
Journal Introduction:
Social Media + Society is an open-access, peer-reviewed scholarly journal that focuses on the socio-cultural, political, psychological, historical, economic, legal, and policy dimensions of social media in societies past, contemporary, and future. We publish interdisciplinary work that draws from the social sciences, humanities, and computational social sciences, reaches out to the arts and natural sciences, and endorses mixed methods and methodologies. The journal is open to a diversity of theoretical paradigms and methodologies. The editorial vision of Social Media + Society draws inspiration from research on social media to outline a field of study poised to grow reflexively as social technologies evolve. We foster the open-access sharing of research on the social properties of media as they manifest themselves through the uses people make of networked platforms past and present, digital and non-digital. The journal presents a collaborative, open, and shared space dedicated exclusively to the study of social media and their implications for societies. It facilitates state-of-the-art research on cutting-edge trends and allows scholars to focus on and track trends specific to this field of study.