Authors: Hauke Behrendt, Wulf Loh
Journal: Review of Social Economy, 80(1), pp. 58–84
DOI: https://doi.org/10.1080/00346764.2022.2027506
Published: 2022-01-02 (Journal Article)
Informed consent and algorithmic discrimination – is giving away your data the new vulnerable?
This paper discusses various forms and sources of algorithmic discrimination. In particular, we explore the connection between the – at first glance – ‘voluntary’ sharing or selling of one’s data on the one hand and the potential risks of automated decision-making based on big data and artificial intelligence on the other. We argue that the implementation of algorithm-driven profiling or decision-making mechanisms will, in many cases, disproportionately disadvantage certain vulnerable groups that are already disadvantaged by many existing datafication practices. We call into question the voluntariness of these mechanisms, especially for certain vulnerable groups, and claim that members of such groups are oftentimes more likely to give away their data. Where these existing datafication practices exacerbate prior disadvantages, they ‘compound historical injustices’ (Hellman, 2018) and thereby constitute forms of morally wrong discrimination. To make matters worse, members of these groups are even more exposed to further algorithmic discrimination based on the additional data collected from them.
Journal introduction:
For over sixty-five years, the Review of Social Economy has published high-quality peer-reviewed work on the many relationships between social values and economics. The field of social economics examines how the economy and social justice relate, and what this implies for economic theory and policy. Papers published range from conceptual work on aligning economic institutions and policies with given ethical principles, to theoretical representations of individual behaviour that allow for both self-interested and "pro-social" motives, to original empirical work on persistent social issues such as poverty, inequality, and discrimination.