The new frontier of platform policy
Matthew Marinett
Internet Policy Review, published 2021-09-13
DOI: 10.14763/2021.3.1570 (https://doi.org/10.14763/2021.3.1570)
Citations: 0
Abstract
Platform policies aimed at user misbehaviour that occurs off the platform, especially offline abuse, are a relatively new and understudied phenomenon that may represent a new frontier of platform policy. Policies of this nature may be necessary to create healthy online communities, but compared to on-platform content moderation they raise unique problems that exacerbate existing concerns about the accountability and transparency of platforms.
This article provides the background and context for the development of such policies through the case study of Twitch.tv. It then discusses three unique challenges raised by the creation and enforcement of policies aimed at off-platform abuse. These are 1) the need for investigation and verifying evidence, 2) the difficulty of balancing the privacy of all parties with a fair process, and 3) the increased potential adverse impacts of error. It argues that current policies are opaque and offer few guarantees to either complainants or the targets of complaints. Further steps, especially greater transparency and broader consultation, must be taken to ensure accountability to users and the public when such policies are implemented.
It concludes by discussing the possibility that similar policies will be adopted by other platforms and calls for greater public and academic discussion of where, when, and how such policies should be implemented.