The Promise of Differential Privacy: A Tutorial on Algorithmic Techniques

C. Dwork
{"title":"差分隐私的承诺:算法技术教程","authors":"C. Dwork","doi":"10.1109/FOCS.2011.88","DOIUrl":null,"url":null,"abstract":"{\\em Differential privacy} describes a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information from other sources is available. At their best, differentially private database mechanisms can make confidential data widely available for accurate data analysis, without resorting to data clean rooms, institutional review boards, data usage agreements, restricted views, or data protection plans. To enjoy the fruits of the research described in this tutorial, the data analyst must accept that raw data can never be accessed directly and that eventually data utility is consumed: overly accurate answers to too many questions will destroy privacy. The goal of algorithmic research on differential privacy is to postpone this inevitability as long as possible.","PeriodicalId":326048,"journal":{"name":"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"50","resultStr":"{\"title\":\"The Promise of Differential Privacy: A Tutorial on Algorithmic Techniques\",\"authors\":\"C. Dwork\",\"doi\":\"10.1109/FOCS.2011.88\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"{\\\\em Differential privacy} describes a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information from other sources is available. 
At their best, differentially private database mechanisms can make confidential data widely available for accurate data analysis, without resorting to data clean rooms, institutional review boards, data usage agreements, restricted views, or data protection plans. To enjoy the fruits of the research described in this tutorial, the data analyst must accept that raw data can never be accessed directly and that eventually data utility is consumed: overly accurate answers to too many questions will destroy privacy. The goal of algorithmic research on differential privacy is to postpone this inevitability as long as possible.\",\"PeriodicalId\":326048,\"journal\":{\"name\":\"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science\",\"volume\":\"41 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"50\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FOCS.2011.88\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FOCS.2011.88","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 50

Abstract

Differential privacy describes a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information from other sources is available. At their best, differentially private database mechanisms can make confidential data widely available for accurate data analysis, without resorting to data clean rooms, institutional review boards, data usage agreements, restricted views, or data protection plans. To enjoy the fruits of the research described in this tutorial, the data analyst must accept that raw data can never be accessed directly and that eventually data utility is consumed: overly accurate answers to too many questions will destroy privacy. The goal of algorithmic research on differential privacy is to postpone this inevitability as long as possible.
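The trade-off the abstract describes, where each released answer consumes a portion of a finite privacy budget, is commonly realized with the Laplace mechanism, a core technique in this line of work. Below is a minimal sketch, not the tutorial's own code: for a counting query (sensitivity 1), adding Laplace noise with scale 1/ε yields ε-differential privacy, and answering k such queries consumes a total budget of roughly k·ε under sequential composition.

```python
import math
import random

def laplace_mechanism(true_count: float, epsilon: float) -> float:
    """Return an epsilon-differentially-private answer to a counting query.

    A counting query changes by at most 1 when one record is added or
    removed, so its L1 sensitivity is 1 and the noise scale is 1/epsilon.
    """
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) via the inverse CDF applied to a
    # uniform draw on (-1/2, 1/2).
    u = random.uniform(-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Each call "spends" epsilon: answering many queries accurately
# (small scale, large epsilon per query) exhausts the budget --
# the sense in which "data utility is consumed."
```

Smaller ε means more noise per answer and weaker accuracy but stronger privacy; the algorithmic research the abstract points to is largely about getting more useful answers out of a fixed budget.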