Competent Third Parties and Content Moderation on Platforms: Potentials of Independent Decision-Making Bodies From a Governance Structure Perspective

A. Heldt, Stephan Dreyer
Journal of Information Policy, vol. 11 (2021). DOI: 10.5325/jinfopoli.11.2021.0266
After many years of much-criticized opacity in content moderation, social media platforms are now opening up to a dialogue with users and policymakers. Until now, liability frameworks in the United States and the European Union (EU) have given platforms incentives not to monitor user-generated content, an increasingly contested model that has led, among other things, to practices and policies of noncontainment. Following discussions of platform power over online speech and of how contentious content benefits the attention economy, there is an observable shift toward stricter content moderation duties as well as greater responsibility for content. Nevertheless, much remains unresolved: the legitimacy of platforms' content moderation rules and decisions is still questioned. The platforms' power over the vast majority of communication in the digital sphere remains difficult to grasp because it is private in nature yet often perceived as public. To address this issue, we adopt a governance structure perspective to identify the potential regulatory advantages of establishing cross-platform external bodies for content moderation, ultimately aiming to provide insights into the opportunities and limitations of such a model.