Internet Policy Rev. Latest Publications

Feminist data protection: an introduction
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1609
Jens T. Theilen, A. Baur, Felix Bieker, R. Quinn, M. Hansen, G. G. Fuster
Abstract: 'Feminist data protection' is not an established term or field of study: data protection discourse is dominated by doctrinal legal and economic positions, and feminist perspectives are few and far between. This editorial introduction summarises a number of recent interventions in the broader fields of data sciences and surveillance studies, then turns to data protection itself and considers how it might be understood, critiqued and possibly reimagined in feminist terms. Finally, the authors return to 'feminist data protection' and the different directions in which it might be further developed—as a feminist approach to data protection, …
Citations: 3

Naming something collective does not make it so: algorithmic discrimination and access to justice
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1600
Jenni Hakkarainen
Abstract: The article problematises the ability of procedural law to address and correct algorithmic discrimination. It argues that algorithmic discrimination is a collective phenomenon, and that legal protection against it therefore needs to be collective. Legal procedures are technologies and design objects that embed values, which can affect their usability for the task they are built for. Drawing on science and technology studies (STS) and feminist critique of law, the article argues that procedural law fails to address algorithmic discrimination, as legal protection is built on datacentrism and individual-centred law. As to the future of new procedural design, it suggests collective redress in the form of ex ante protection as a promising way forward.
This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster.
Citations: 5

Bleeding data: the case of fertility and menstruation tracking apps
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1599
Anastasia Siapka, E. Biasin
Abstract: Reports by journalists, non-profits and consumer organisations, as well as the authors' first-hand review of relevant privacy policies, reveal that fertility and menstruation tracking apps (FMTs) collect and share an excessive array of data. Through doctrinal legal research, we evaluate this data processing in light of data and consumer protection law, but find the commonly invoked concepts of 'vulnerability', 'consent' and 'transparency' insufficient to alleviate power imbalances. Instead, drawing on a feminist understanding of work and the autonomist 'social factory', we argue that users perform unpaid, even gendered, consumer labour in the digital realm and explore the potential of a demand for wages.
Citations: 2

Bias does not equal bias: a socio-technical typology of bias in data-based algorithmic systems
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1598
Paola Lopez
Abstract: This paper introduces a socio-technical typology of bias in data-driven machine learning and artificial intelligence systems. The typology is linked to the conceptualisations of legal antidiscrimination regulations, so that the concept of structural inequality—and, therefore, of undesirable bias—is defined accordingly. By analysing the controversial Austrian "AMS algorithm" as a case study, as well as examples in the contexts of face detection, risk assessment and health care management, this paper defines the following three types of bias: firstly, purely technical bias as a systematic deviation of the datafied version of a phenomenon from reality; secondly, socio-technical bias as a systematic deviation due to structural inequalities, which must be strictly distinguished from, thirdly, societal bias, which depicts—correctly—the structural inequalities that prevail in society. This paper argues that a clear distinction must be made between different concepts of bias in such systems in order to analytically assess these systems and, subsequently, inform political action.
Citations: 7

Whiteness in and through data protection: an intersectional approach to anti-violence apps and #MeToo bots
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1589
R. Shelby, J. Harb, Kathryn Henne
Abstract: This article analyses apps and artificial intelligence chatbots designed to provide survivors of sexual violence with emergency assistance, education, and a means to report and build evidence against perpetrators. Demonstrating how these technologies both confront and constitute forms of oppression, this analysis complicates assumptions about data protection through an intersectional feminist examination of these digital tools. In surveying different anti-violence apps, we interrogate how the racial formation of whiteness manifests in ways that can be understood as the political, representational, and structural intersectional dimensions of data protection.
Citations: 5

Prescripted living: gender stereotypes and data-based surveillance in the UK welfare state
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1593
L. Carter
Abstract: The welfare benefits system in the UK has historically favoured individuals who conform to gender stereotypes; it also increasingly uses surveillance and conditionality to determine who is 'deserving' of support. This paper argues that this combination reinforces structures of categorisation and control, risking a vicious cycle which causes harm at both an individual and a societal level; it also argues that human rights offer a tool for analysis and resistance to this harm.
Citations: 0

Artificial intelligence and consent: a feminist anti-colonial critique
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1602
Joana Varon, P. Peña
Abstract: Feminist theories have extensively debated consent in sexual and political contexts. But what does it mean to consent when we are talking about our data bodies feeding artificial intelligence (AI) systems? This article builds a feminist and anti-colonial critique of how an individualistic notion of consent is being used to legitimate practices of the so-called emerging Digital Welfare States, focused on the digitalisation of anti-poverty programmes. The goal is to expose how the functional role of digital consent has been enabling data extractivist practices for control and exclusion, another manifestation of colonialism embedded in cutting-edge digital technology.
This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster.
Citations: 7

Data and Afrofuturism: an emancipated subject?
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1597
Aisha P.L. Kadiri
Abstract: The concept of an individual, liberal data subject, traditionally at the centre of data protection efforts, has recently come under scrutiny. At the same time, the particularly destructive effect of digital technology on Black people establishes the need for an analysis that not only considers racial dimensions but brings them to the forefront. I argue that because Afrofuturism situates the Black struggle in persistent, yet continuously changing, structural disparities and power relations, it offers a powerful departure point for re-imagining data protection. Sketching an Afrofuturist data subject then centres on radical subjectivity, collectivity, and contextuality.
Citations: 2

What we do with data: a performative critique of data 'collection'
Internet Policy Rev. Pub Date: 2021-12-07 DOI: 10.14763/2021.4.1588
Garfield Benjamin
Abstract: Data collection is everywhere. It happens overtly and behind the scenes. It is a specific moment of legal obligation, the point at which the purpose and conditions of the data are legitimised. But what does the term data collection mean? What does it say or not say? Does it really capture the extraction or imposition taking place? How do terms and practices relate in defining the norms of data in society? This article undertakes a critique of data collection using data feminism and a performative theory of privacy: as a resource, an objective discovery and an assumption. It also discusses alternative terms and the implications of how we describe practices of 'collecting' data.
Citations: 8

The exploitation of vulnerability through personalised marketing communication: are consumers protected?
Internet Policy Rev. Pub Date: 2021-11-08 DOI: 10.14763/2021.4.1585
J. Strycharz, B. Duivenvoorde
Abstract: While data-driven personalisation strategies in marketing offer consumers several benefits, they potentially also create new disparities and vulnerabilities in society and in individuals. This article explores the ways in which personalised marketing communication may exploit consumer vulnerability and, building on empirical findings on the issue, investigates whether consumers are protected against such exploitation under EU consumer protection law. We show a number of ways in which personalisation may lead to the exploitation of internal and external vulnerabilities, and that EU consumer law contains significant barriers to effectively addressing such exploitation.
Citations: 3