{"title":"隐私敏感数据中的集体危害","authors":"Nicholas M. Weber","doi":"10.1109/JCDL52503.2021.00032","DOIUrl":null,"url":null,"abstract":"Privacy protections for human subject data are often focused on reducing individual harms that result from improper disclosure of personally identifiable information. However, in a networked environment where information infrastructures enable rapid sharing and linking of different datasets there exist numerous harms which abstract to group or collective levels. In this paper we discuss how privacy protections aimed at individual harms, as opposed to collective or group harms, results in an incompatible notion of privacy protections for social science research that synthesizes multiple data sources. Using the framework of Contextual Integrity we present empirical scenarios drawn from 17 in-depth interviews with researchers conducting synthetic research using one or more privacy sensitive data sources. We use these scenarios to identify ways that digital infrastructure providers can help social scientists manage collective harms over time through specific, targeted privacy engineering of supporting research infrastructures and data curation.","PeriodicalId":112400,"journal":{"name":"2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL)","volume":"81 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Surfacing Collective Harms in Privacy Sensitive Data\",\"authors\":\"Nicholas M. Weber\",\"doi\":\"10.1109/JCDL52503.2021.00032\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Privacy protections for human subject data are often focused on reducing individual harms that result from improper disclosure of personally identifiable information. However, in a networked environment where information infrastructures enable rapid sharing and linking of different datasets there exist numerous harms which abstract to group or collective levels. In this paper we discuss how privacy protections aimed at individual harms, as opposed to collective or group harms, results in an incompatible notion of privacy protections for social science research that synthesizes multiple data sources. Using the framework of Contextual Integrity we present empirical scenarios drawn from 17 in-depth interviews with researchers conducting synthetic research using one or more privacy sensitive data sources. 
We use these scenarios to identify ways that digital infrastructure providers can help social scientists manage collective harms over time through specific, targeted privacy engineering of supporting research infrastructures and data curation.\",\"PeriodicalId\":112400,\"journal\":{\"name\":\"2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL)\",\"volume\":\"81 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/JCDL52503.2021.00032\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/JCDL52503.2021.00032","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Surfacing Collective Harms in Privacy Sensitive Data
Privacy protections for human subject data are often focused on reducing individual harms that result from improper disclosure of personally identifiable information. However, in a networked environment where information infrastructures enable rapid sharing and linking of different datasets, there exist numerous harms that abstract to the group or collective level. In this paper we discuss how privacy protections aimed at individual harms, as opposed to collective or group harms, result in an incompatible notion of privacy protection for social science research that synthesizes multiple data sources. Using the framework of Contextual Integrity, we present empirical scenarios drawn from 17 in-depth interviews with researchers conducting synthetic research using one or more privacy-sensitive data sources. We use these scenarios to identify ways that digital infrastructure providers can help social scientists manage collective harms over time through specific, targeted privacy engineering of supporting research infrastructures and data curation.