Priorities and exclusions within Trust and Safety industry standards
Blake Hallinan, CJ Reynolds, Rebecca Scharlach, Dana Theiler, Noa Niv, Omer Rothenstein, Isabell Knief, Yehonatan Kuperberg
New Media & Society, Journal Article, published 2025-07-15. DOI: 10.1177/14614448251357225
Platform governance is simultaneously a matter of public concern and a professional calling for Trust and Safety, the nascent field tasked with setting and enforcing standards of acceptable behavior on digital platforms. Yet we know little about how the growing professionalization of platform governance shapes content moderation practices. Using the schema of content abuse from the Trust & Safety Professional Association, we analyzed the Community Guidelines of 12 diverse livestreaming platforms. Our findings reveal significant alignment between professional guidelines and industry practices, which is especially pronounced for Twitch and YouTube. However, industry standards only partially address the policies of adult camming platforms and AfreecaTV, a Korean-based livestreaming service, revealing notable absences in how Trust and Safety imagines the boundaries of the industry. We reflect on the priorities and exclusions of emerging industry standards and conclude with a call for academics and practitioners to broaden the conversation around content moderation.
About the journal:
New Media & Society engages in critical discussions of the key issues arising from the scale and speed of new media development, drawing on a wide range of disciplinary perspectives and on both theoretical and empirical research. The journal includes contributions on:
- the individual and the social, the cultural and the political dimensions of new media
- the global and local dimensions of the relationship between media and social change
- contemporary as well as historical developments
- the implications and impacts of, as well as the determinants and obstacles to, media change
- the relationship between theory, policy and practice