ALGORITHMS, ADDICTION, AND ADOLESCENT MENTAL HEALTH: An Interdisciplinary Study to Inform State-level Policy Action to Protect Youth from the Dangers of Social Media

IF 0.5 | CAS Tier 4 (Sociology) | JCR Q3 (Law)
American Journal of Law & Medicine | Pub Date: 2023-07-01 | Epub Date: 2024-02-12 | DOI: 10.1017/amj.2023.25
Nancy Costello, Rebecca Sutton, Madeline Jones, Mackenzie Almassian, Amanda Raffoul, Oluwadunni Ojumu, Meg Salvia, Monique Santoso, Jill R Kavanaugh, S Bryn Austin
{"title":"算法、成瘾和青少年心理健康:一项跨学科研究:为州一级保护青少年免受社交媒体危害的政策行动提供信息。","authors":"Nancy Costello, Rebecca Sutton, Madeline Jones, Mackenzie Almassian, Amanda Raffoul, Oluwadunni Ojumu, Meg Salvia, Monique Santoso, Jill R Kavanaugh, S Bryn Austin","doi":"10.1017/amj.2023.25","DOIUrl":null,"url":null,"abstract":"<p><p>A recent Wall Street Journal investigation revealed that TikTok floods child and adolescent users with videos of rapid weight loss methods, including tips on how to consume less than 300 calories a day and promoting a \"corpse bride diet,\" showing emaciated girls with protruding bones. The investigation involved the creation of a dozen automated accounts registered as 13-year-olds and revealed that TikTok algorithms fed adolescents tens of thousands of weight-loss videos within just a few weeks of joining the platform. Emerging research indicates that these practices extend well beyond TikTok to other social media platforms that engage millions of U.S. youth on a daily basis.Social media algorithms that push extreme content to vulnerable youth are linked to an increase in mental health problems for adolescents, including poor body image, eating disorders, and suicidality. Policy measures must be taken to curb this harmful practice. The Strategic Training Initiative for the Prevention of Eating Disorders (STRIPED), a research program based at the Harvard T.H. Chan School of Public Health and Boston Children's Hospital, has assembled a diverse team of scholars, including experts in public health, neuroscience, health economics, and law with specialization in First Amendment law, to study the harmful effects of social media algorithms, identify the economic incentives that drive social media companies to use them, and develop strategies that can be pursued to regulate social media platforms' use of algorithms. For our study, we have examined a critical mass of public health and neuroscience research demonstrating mental health harms to youth. We have conducted a groundbreaking economic study showing nearly $11 billion in advertising revenue is generated annually by social media platforms through advertisements targeted at users 0 to 17 years old, thus incentivizing platforms to continue their harmful practices. We have also examined legal strategies to address the regulation of social media platforms by conducting reviews of federal and state legal precedent and consulting with stakeholders in business regulation, technology, and federal and state government.While nationally the issue is being scrutinized by Congress and the Federal Trade Commission, quicker and more effective legal strategies that would survive constitutional scrutiny may be implemented by states, such as the Age Appropriate Design Code Act recently adopted in California, which sets standards that online services likely to be accessed by children must follow. Another avenue for regulation may be through states mandating that social media platforms submit to algorithm risk audits conducted by independent third parties and publicly disclose the results. 
Furthermore, Section 230 of the federal Communications Decency Act, which has long shielded social media platforms from liability for wrongful acts, may be circumvented if it is proven that social media companies share advertising revenues with content providers posting illegal or harmful content.Our research team's public health and economic findings combined with our legal analysis and resulting recommendations, provide innovative and viable policy actions that state lawmakers and attorneys general can take to protect youth from the harms of dangerous social media algorithms.</p>","PeriodicalId":7680,"journal":{"name":"American Journal of Law & Medicine","volume":"49 2-3","pages":"135-172"},"PeriodicalIF":0.5000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ALGORITHMS, ADDICTION, AND ADOLESCENT MENTAL HEALTH: An Interdisciplinary Study to Inform State-level Policy Action to Protect Youth from the Dangers of Social Media.\",\"authors\":\"Nancy Costello, Rebecca Sutton, Madeline Jones, Mackenzie Almassian, Amanda Raffoul, Oluwadunni Ojumu, Meg Salvia, Monique Santoso, Jill R Kavanaugh, S Bryn Austin\",\"doi\":\"10.1017/amj.2023.25\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>A recent Wall Street Journal investigation revealed that TikTok floods child and adolescent users with videos of rapid weight loss methods, including tips on how to consume less than 300 calories a day and promoting a \\\"corpse bride diet,\\\" showing emaciated girls with protruding bones. The investigation involved the creation of a dozen automated accounts registered as 13-year-olds and revealed that TikTok algorithms fed adolescents tens of thousands of weight-loss videos within just a few weeks of joining the platform. Emerging research indicates that these practices extend well beyond TikTok to other social media platforms that engage millions of U.S. youth on a daily basis.Social media algorithms that push extreme content to vulnerable youth are linked to an increase in mental health problems for adolescents, including poor body image, eating disorders, and suicidality. Policy measures must be taken to curb this harmful practice. The Strategic Training Initiative for the Prevention of Eating Disorders (STRIPED), a research program based at the Harvard T.H. Chan School of Public Health and Boston Children's Hospital, has assembled a diverse team of scholars, including experts in public health, neuroscience, health economics, and law with specialization in First Amendment law, to study the harmful effects of social media algorithms, identify the economic incentives that drive social media companies to use them, and develop strategies that can be pursued to regulate social media platforms' use of algorithms. For our study, we have examined a critical mass of public health and neuroscience research demonstrating mental health harms to youth. We have conducted a groundbreaking economic study showing nearly $11 billion in advertising revenue is generated annually by social media platforms through advertisements targeted at users 0 to 17 years old, thus incentivizing platforms to continue their harmful practices. 
We have also examined legal strategies to address the regulation of social media platforms by conducting reviews of federal and state legal precedent and consulting with stakeholders in business regulation, technology, and federal and state government.While nationally the issue is being scrutinized by Congress and the Federal Trade Commission, quicker and more effective legal strategies that would survive constitutional scrutiny may be implemented by states, such as the Age Appropriate Design Code Act recently adopted in California, which sets standards that online services likely to be accessed by children must follow. Another avenue for regulation may be through states mandating that social media platforms submit to algorithm risk audits conducted by independent third parties and publicly disclose the results. Furthermore, Section 230 of the federal Communications Decency Act, which has long shielded social media platforms from liability for wrongful acts, may be circumvented if it is proven that social media companies share advertising revenues with content providers posting illegal or harmful content.Our research team's public health and economic findings combined with our legal analysis and resulting recommendations, provide innovative and viable policy actions that state lawmakers and attorneys general can take to protect youth from the harms of dangerous social media algorithms.</p>\",\"PeriodicalId\":7680,\"journal\":{\"name\":\"American Journal of Law & Medicine\",\"volume\":\"49 2-3\",\"pages\":\"135-172\"},\"PeriodicalIF\":0.5000,\"publicationDate\":\"2023-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"American Journal of Law & Medicine\",\"FirstCategoryId\":\"90\",\"ListUrlMain\":\"https://doi.org/10.1017/amj.2023.25\",\"RegionNum\":4,\"RegionCategory\":\"社会学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/2/12 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"LAW\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"American Journal of Law & Medicine","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.1017/amj.2023.25","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/2/12 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"LAW","Score":null,"Total":0}
Citations: 0

Abstract

A recent Wall Street Journal investigation revealed that TikTok floods child and adolescent users with videos of rapid weight loss methods, including tips on how to consume less than 300 calories a day and promotion of a "corpse bride diet" showing emaciated girls with protruding bones. The investigation involved the creation of a dozen automated accounts registered as 13-year-olds and revealed that TikTok algorithms fed these adolescent accounts tens of thousands of weight-loss videos within just a few weeks of joining the platform. Emerging research indicates that these practices extend well beyond TikTok to other social media platforms that engage millions of U.S. youth on a daily basis.

Social media algorithms that push extreme content to vulnerable youth are linked to an increase in mental health problems for adolescents, including poor body image, eating disorders, and suicidality. Policy measures must be taken to curb this harmful practice. The Strategic Training Initiative for the Prevention of Eating Disorders (STRIPED), a research program based at the Harvard T.H. Chan School of Public Health and Boston Children's Hospital, has assembled a diverse team of scholars, including experts in public health, neuroscience, health economics, and law with specialization in First Amendment law, to study the harmful effects of social media algorithms, identify the economic incentives that drive social media companies to use them, and develop strategies that can be pursued to regulate social media platforms' use of algorithms.

For our study, we have examined a critical mass of public health and neuroscience research demonstrating mental health harms to youth. We have conducted a groundbreaking economic study showing that nearly $11 billion in advertising revenue is generated annually by social media platforms through advertisements targeted at users 0 to 17 years old, thus incentivizing platforms to continue their harmful practices. We have also examined legal strategies to address the regulation of social media platforms by conducting reviews of federal and state legal precedent and consulting with stakeholders in business regulation, technology, and federal and state government.

While nationally the issue is being scrutinized by Congress and the Federal Trade Commission, quicker and more effective legal strategies that would survive constitutional scrutiny may be implemented by states, such as the Age Appropriate Design Code Act recently adopted in California, which sets standards that online services likely to be accessed by children must follow. Another avenue for regulation may be for states to mandate that social media platforms submit to algorithm risk audits conducted by independent third parties and publicly disclose the results. Furthermore, Section 230 of the federal Communications Decency Act, which has long shielded social media platforms from liability for wrongful acts, may be circumvented if it is proven that social media companies share advertising revenues with content providers posting illegal or harmful content.

Our research team's public health and economic findings, combined with our legal analysis and resulting recommendations, provide innovative and viable policy actions that state lawmakers and attorneys general can take to protect youth from the harms of dangerous social media algorithms.

Source journal
CiteScore: 0.80
Self-citation rate: 16.70%
Articles published: 8
Journal description: AJLM will solicit blind comments from expert peer reviewers, including faculty members of our editorial board, as well as from other preeminent health law and public policy academics and professionals from across the country and around the world.