The Case for Establishing a Collective Perspective to Address the Harms of Platform Personalization

Ayelet Gordon-Tapiero, Alexandra Wood, Katrina Ligett
Proceedings of the 2022 Symposium on Computer Science and Law, November 2022. DOI: 10.1145/3511265.3550450. Cited by: 3.

Abstract

Personalization on digital platforms drives a broad range of harms, including misinformation, manipulation, social polarization, subversion of autonomy, and discrimination. In recent years, policymakers, civil society advocates, and researchers have proposed a wide range of interventions to address these challenges. In this article, we argue that the emerging toolkit reflects an individualistic view of both personal data and data-driven harms that will likely be inadequate to address growing harms in the global data ecosystem. We maintain that interventions must be grounded in an understanding of the fundamentally collective nature of data, wherein platforms leverage complex patterns of behaviors and characteristics observed across a large population to draw inferences and make predictions about individuals. Using the lens of the collective nature of data, we evaluate various approaches to addressing personalization-driven harms currently under consideration. This lens also allows us to frame concrete guidance for future legislation in this space and advocate meaningful transparency that goes far beyond current proposals. We offer a roadmap for what meaningful transparency must constitute: a collective perspective providing a third party with ongoing insight into the information gathered and observed about individuals and how it correlates with any personalized content they receive, across a large, representative population. These insights would enable the third party to understand, identify, quantify, and address cases of personalization-driven harms. We discuss how such transparency can be achieved without sacrificing privacy and provide guidelines for legislation to support the development of this proposal.