{"title":"Differentially Private Instance-Based Noise Mechanisms in Practice","authors":"Sébastien Canard, Baptiste Olivier, Tony Quertier","doi":"10.1109/PST.2017.00022","DOIUrl":null,"url":null,"abstract":"Differential privacy is a widely used privacy model today, whose privacy guarantees are obtained to the price of a random perturbation of the result. In some situations, basic differentially private mechanisms may add too much noise to reach a reasonable level of privacy. To answer this shortcoming, several works have provided more technically involved mechanisms, using a new paradigm of differentially private mechanisms called instance-based noise mechanisms. In this paper, we exhibit for the first time theoretical conditions for an instance-based noise mechanism to be (∊,δ)-differentially private. We exploit the simplicity of these conditions to design a novel instance-based noise differentially private mechanism. Conducting experimental evaluations, we show that our mechanism compares favorably to existing instance-based noise mechanisms, either regarding time complexity or accuracy of the sanitized result. By contrast with some prior works, our algorithms do not involve the computation of all local sensitivities, a computational task which was proved to be NP hard in some cases, namely for statistic queries on graphs. Our framework is as general as possible and can be used to answer any query, which is in contrast with recent designs of instance-based noise mechanisms where only graph statistics queries are considered.","PeriodicalId":405887,"journal":{"name":"2017 15th Annual Conference on Privacy, Security and Trust (PST)","volume":"45 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 15th Annual Conference on Privacy, Security and Trust (PST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PST.2017.00022","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Differential privacy is a widely used privacy model today, whose privacy guarantees are obtained at the price of a random perturbation of the result. In some situations, basic differentially private mechanisms may have to add a large amount of noise to reach a reasonable level of privacy. To address this shortcoming, several works have proposed more technically involved mechanisms based on a new paradigm of differentially private mechanisms called instance-based noise mechanisms. In this paper, we exhibit for the first time theoretical conditions for an instance-based noise mechanism to be (ε,δ)-differentially private. We exploit the simplicity of these conditions to design a novel instance-based noise differentially private mechanism. Through experimental evaluations, we show that our mechanism compares favorably to existing instance-based noise mechanisms in terms of either time complexity or accuracy of the sanitized result. In contrast with some prior works, our algorithms do not require computing all local sensitivities, a task which was proved to be NP-hard in some cases, namely for statistics queries on graphs. Our framework is as general as possible and can be used to answer any query, in contrast with recent designs of instance-based noise mechanisms that consider only graph statistics queries.
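
The abstract does not spell out the mechanism itself, but the instance-based noise paradigm it refers to is straightforward to illustrate: instead of calibrating noise to the worst-case (global) sensitivity of a query over all possible databases, the noise is calibrated to a quantity that depends on the database actually at hand, such as a smooth upper bound on its local sensitivity. The Python sketch below contrasts the two approaches on a median query over data in [0, 1]. It is only an illustration under stated assumptions: the median query, the smooth-sensitivity formula of Nissim, Raskhodnikova and Smith, and the Laplace calibration constants are choices made here for concreteness, not the mechanism proposed in the paper.

import numpy as np

def smooth_sensitivity_median(x, beta, lo=0.0, hi=1.0):
    # Smooth upper bound on the local sensitivity of the median for data
    # clipped to [lo, hi], using the closed form of Nissim, Raskhodnikova
    # and Smith; assumes an odd number of records.
    x = np.sort(np.clip(np.asarray(x, dtype=float), lo, hi))
    n = len(x)
    m = (n - 1) // 2                                    # 0-based median index
    padded = np.concatenate([np.full(n, lo), x, np.full(n, hi)])
    at = lambda i: padded[min(max(i + n, 0), len(padded) - 1)]
    best = 0.0
    for k in range(n + 1):
        width_k = max(at(m + t) - at(m + t - k - 1) for t in range(k + 2))
        best = max(best, np.exp(-beta * k) * width_k)
    return best

def median_instance_based(x, eps, delta, lo=0.0, hi=1.0, rng=None):
    # Instance-based noise: Laplace noise scaled to the smooth sensitivity
    # of this particular dataset.  The calibration (beta and the factor 2)
    # follows one standard choice from the smooth-sensitivity literature,
    # not the calibration used in the paper.
    rng = np.random.default_rng() if rng is None else rng
    beta = eps / (2.0 * np.log(1.0 / delta))
    scale = 2.0 * smooth_sensitivity_median(x, beta, lo, hi) / eps
    return float(np.median(np.clip(x, lo, hi)) + rng.laplace(0.0, scale))

def median_global_sensitivity(x, eps, lo=0.0, hi=1.0, rng=None):
    # Baseline Laplace mechanism: over all datasets in [lo, hi]^n, changing
    # one record can move the median by up to hi - lo, so the noise scale
    # must be (hi - lo) / eps no matter how concentrated the data is.
    rng = np.random.default_rng() if rng is None else rng
    return float(np.median(np.clip(x, lo, hi)) + rng.laplace(0.0, (hi - lo) / eps))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.beta(2.0, 5.0, size=101)                 # concentrated data
    print("true median            :", np.median(data))
    print("global-sensitivity DP  :", median_global_sensitivity(data, eps=1.0, rng=rng))
    print("instance-based noise DP:", median_instance_based(data, eps=1.0, delta=1e-5, rng=rng))

On concentrated data the gaps around the median are small, so the instance-based estimate is typically much closer to the true median than the worst-case-calibrated one; on adversarially spread data the instance-based noise scale approaches the worst-case scale, as differential privacy requires.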