Feminist data protection: an introduction
Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, Gloria González Fuster
Internet Policy Review, 10(4), 7 December 2021. DOI: 10.14763/2021.4.1609
Abstract: ‘Feminist data protection’ is not an established term or field of study: data protection discourse is dominated by doctrinal legal and economic positions, and feminist perspectives are few and far between. This editorial introduction summarises a number of recent interventions in the broader fields of data sciences and surveillance studies, then turns to data protection itself and considers how it might be understood, critiqued and possibly reimagined in feminist terms. Finally, the authors return to ‘feminist data protection’ and the different directions in which it might be further developed—as a feminist approach to data protection, […]

{"title":"Naming something collective does not make it so: algorithmic discrimination and access to justice","authors":"Jenni Hakkarainen","doi":"10.14763/2021.4.1600","DOIUrl":"https://doi.org/10.14763/2021.4.1600","url":null,"abstract":"The article problematises the ability of procedural law to address and correct algorithmic discrimination. It argues that algorithmic discrimination is a collective phenomenon, and therefore legal protection thereof needs to be collective. Legal procedures are technologies and design objects that embed values that can affect their usability to perform the task they are built for. Drawing from science and technology studies (STS) and feminist critique on law, the article argues that procedural law fails to address algorithmic discrimination, as legal protection is built on datacentrism and individual-centred law. As to the future of new procedural design, it suggests collective redress in the form of ex ante protection as a promising way forward. Issue 4 This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster. Introduction: Technology, discrimination and access to justice We discriminate. Discrimination can be intentional or unconscious, as human beings have biased attitudes towards other people. These attitudes become integral parts of institutions and their decision-making processes, where they eventually promote inequality more broadly. Human beings are not the only ones who embed and mediate values. Things are also claimed to have politics (Marres, 2012; Winner, 1980). The intended and unintended values of the participants who contribute to the design of a technological artefact are performed through those technologies (Pfaffenberger, 1992). Technologies, therefore, can mediate biased attitudes and inequality. Recent advances in the field of artificial intelligence (AI) have raised concerns about how data-driven technologies, more specifically automated decision-making tools, mediate discrimination. These concerns have turned into reality, as computer programmes used e.g. in recruiting1 have been discovered to produce biased decisions. Feminist scholar Cathy O’Neil argues that algorithmic discrimination is most likely to affect minorities, low-income people or people with disabilities through automated social welfare systems and credit scoring (O’Neil, 2016). In addition, discriminating algorithms have been embedded in predictive policing systems2 and platform economy governance3, just to name two examples. Despite growing awareness, the current research and institutional responses to discrimination are partly hindered by their inability to recognise the connections between algorithmic discrimination and the long history of research on discrimination as well as the role that technologies play in postmodern society. For example, non-technology specific contemporary feminism has argued since the 1960s that discrimination is a matter of collective experience, enabled by societal and legal 1. 
See https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G, page visited","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129909956","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bleeding data: the case of fertility and menstruation tracking apps","authors":"Anastasia Siapka, E. Biasin","doi":"10.14763/2021.4.1599","DOIUrl":"https://doi.org/10.14763/2021.4.1599","url":null,"abstract":": Journalists, non-profits and consumer organisations, as well as the authors’ first-hand review of relevant privacy policies reveal that fertility and menstruation tracking apps (FMTs) collect and share an excessive array of data. Through doctrinal legal research, we evaluate this data processing in light of data and consumer protection law but find the commonly invoked concepts of ‘vulnerability’, ‘consent’ and ‘transparency’ insufficient to alleviate power imbalances. Instead, drawing on a feminist understanding of work and the autonomist ‘social factory’, we argue that users perform unpaid, even gendered, consumer labour in the digital realm and explore the potential of a demand for wages.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"78 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134285618","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bias does not equal bias: a socio-technical typology of bias in data-based algorithmic systems","authors":"Paola Lopez","doi":"10.14763/2021.4.1598","DOIUrl":"https://doi.org/10.14763/2021.4.1598","url":null,"abstract":": This paper introduces a socio-technical typology of bias in data-driven machine learning and artificial intelligence systems. The typology is linked to the conceptualisations of legal antidiscrimination regulations, so that the concept of structural inequality—and, therefore, of undesirable bias—is defined accordingly. By analysing the controversial Austrian “AMS algorithm” as a case study as well as examples in the contexts of face detection, risk assessment and health care management, this paper defines the following three types of bias: firstly, purely technical bias as a systematic deviation of the datafied version of a phenomenon from reality; secondly, socio-technical bias as a systematic deviation due to structural inequalities, which must be strictly distinguished from, thirdly, societal bias, which depicts—correctly—the structural inequalities that prevail in society. This paper argues that a clear distinction must be made between different concepts of bias in such systems in order to analytically assess these systems and, subsequently, inform political action.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"269 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116066690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Whiteness in and through data protection: an intersectional approach to anti-violence apps and #MeToo bots","authors":"R. Shelby, J. Harb, Kathryn Henne","doi":"10.14763/2021.4.1589","DOIUrl":"https://doi.org/10.14763/2021.4.1589","url":null,"abstract":": This article analyses apps and artificial intelligence chatbots designed to offer survivors of sexual violence with emergency assistance, education, and a means to report and build evidence against perpetrators. Demonstrating how these technologies both confront and constitute forms of oppression, this analysis complicates assumptions about data protection through an intersectional feminist examination of these digital tools. In surveying different anti-violence apps, we interrogate how the racial formation of whiteness manifests in ways that can be understood as the political, representational, and structural intersectional dimensions of data protection.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"33 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129853494","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Prescripted living: gender stereotypes and data-based surveillance in the UK welfare state","authors":"L. Carter","doi":"10.14763/2021.4.1593","DOIUrl":"https://doi.org/10.14763/2021.4.1593","url":null,"abstract":": The welfare benefits system in the UK has historically favoured individuals who conform to gender stereotypes: it also increasingly uses surveillance and conditionality to determine who is ‘deserving’ of support. This paper argues that this combination reinforces structures of categorisation and control, risking a vicious cycle which causes harm at both an individual and societal level: it also argues that human rights offers a tool for analysis and resistance to this harm.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128071766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Artificial intelligence and consent: a feminist anti-colonial critique","authors":"Joana Varon, P. Peña","doi":"10.14763/2021.4.1602","DOIUrl":"https://doi.org/10.14763/2021.4.1602","url":null,"abstract":"Feminist theories have extensively debated consent in sexual and political contexts. But what does it mean to consent when we are talking about our data bodies feeding artificial intelligence (AI) systems? This article builds a feminist and anti-colonial critique about how an individualistic notion of consent is being used to legitimate practices of the so-called emerging Digital Welfare States, focused on digitalisation of anti-poverty programmes. The goal is to expose how the functional role of digital consent has been enabling data extractivist practices for control and exclusion, another manifestation of colonialism embedded in cutting-edge digital technology. Issue 4 This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128484011","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data and Afrofuturism: an emancipated subject?","authors":"Aisha P.L. Kadiri","doi":"10.14763/2021.4.1597","DOIUrl":"https://doi.org/10.14763/2021.4.1597","url":null,"abstract":": The concept of an individual, liberal data subject, who was traditionally at the centre of data protection efforts has recently come under scrutiny. At the same time, the particularly destructive effect of digital technology on Black people establishes the need for an analysis that not only considers but brings racial dimensions to the forefront. I argue that because Afrofuturism situates the Black struggle in persistent, yet continuously changing structural disparities and power relations, it offers a powerful departure point for re-imagining data protection. Sketching an Afrofuturist data subject then centres on radical subjectivity, collectivity, and contextuality.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125002349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"What we do with data: a performative critique of data 'collection'","authors":"Garfield Benjamin","doi":"10.14763/2021.4.1588","DOIUrl":"https://doi.org/10.14763/2021.4.1588","url":null,"abstract":": Data collection is everywhere. It happens overtly and behind the scenes. It is a specific moment of legal obligation, the point at which the purpose and conditions of the data are legitimised. But what does the term data collection mean? What does it say or not say? Does it really capture the extraction or imposition taking place? How do terms and practices relate in defining the norms of data in society? This article undertakes a critique of data collection using data feminism and a performative theory of privacy: as a resource, an objective discovery and an assumption. It also discusses alternative terms and the implications of how we describe practices of ‘collecting’ data.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115899860","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The exploitation of vulnerability through personalised marketing communication: are consumers protected?","authors":"J. Strycharz, B. Duivenvoorde","doi":"10.14763/2021.4.1585","DOIUrl":"https://doi.org/10.14763/2021.4.1585","url":null,"abstract":": While data-driven personalisation strategies in marketing offer consumers several benefits, they potentially also create new disparities and vulnerabilities in society, and in individuals. This article explores in what ways application of so-called personalised marketing communication may lead to exploitation of vulnerability of consumers and builds on empirical findings on the issue by investigating if consumers are protected against such vulnerabilities under EU consumer protection law. We show a number of ways in which personalisation may lead to exploitation of internal and external vulnerabilities and that EU consumer law contains significant barriers to effectively address such exploitation.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130571768","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}