A Consumer Protection Approach to Platform Content Moderation

Mark MacCarthy
DOI: 10.2139/SSRN.3408459
Journal: LSN: Rights & Liberties (Topic)
Published: 2019-06-22 (Journal Article)
Citations: 2

Abstract

Congress should consider legislation to regulate the content moderation practices of platforms. Failure to act will leave platform users unprotected and will allow other countries, notably the European Union and China, to seize global leadership in yet another area of tech policy. But a law requiring content rules against the most salient kinds of harmful platform content, including hate speech, terrorist material, and disinformation campaigns, would not pass constitutional muster under the First Amendment. In contrast, a consumer protection approach to content moderation might be effective and pass First Amendment scrutiny. The Federal Trade Commission, on its own or with authorization from Congress, could treat the failure to establish and maintain a procedurally adequate content moderation program as an unfair practice. This would effectively require platforms to have a content moderation program in place that contains content rules, enforcement procedures, and due process protections, including disclosure, mechanisms to ask for reinstatement, and an internal appeals process, but it would not mandate the substance of the platform's content rules. It would respond to strict First Amendment scrutiny as a narrowly crafted requirement that burdens speech no more than necessary to achieve the compelling government purpose of preventing an unfair trade practice. In addition, or alternatively, the FTC might be authorized to use its deception authority to require platforms to say what they do and do what they say in connection with content moderation programs. The FTC would treat failure to disclose key elements of a content moderation program as a material omission, and the failure to act in accordance with its program as a deceptive or misleading practice. Its First Amendment defense would rest on the compelling government interest in preventing consumer deception. The unfairness version would be more effective but less likely to survive a constitutional challenge. The pure disclosure version would be less effective, but more likely to be found consistent with current First Amendment jurisprudence. One additional advantage of this consumer protection approach is that it does not require controversial modification of Section 230 immunities for platforms.