{"title":"贝叶斯后验抽样隐私损失的新界","authors":"Xingyuan Zhao, F. Liu","doi":"10.1145/3508398.3519355","DOIUrl":null,"url":null,"abstract":"Differential privacy (DP) is a state-of-the-art concept that formalizes privacy guarantees. We derive a new bound for the privacy loss from releasing Bayesian posterior samples in the setting of DP. The new bound is tighter than the existing bounds for common Bayesian models and is also consistent with the likelihood principle. We apply the privacy loss quantified by the new bound to release differentially private synthetic data from Bayesian models in several experiments and show the improved utility of the synthetic data compared to those generated from explicitly designed randomization mechanisms that privatize posterior distributions.","PeriodicalId":102306,"journal":{"name":"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A New Bound for Privacy Loss from Bayesian Posterior Sampling\",\"authors\":\"Xingyuan Zhao, F. Liu\",\"doi\":\"10.1145/3508398.3519355\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Differential privacy (DP) is a state-of-the-art concept that formalizes privacy guarantees. We derive a new bound for the privacy loss from releasing Bayesian posterior samples in the setting of DP. The new bound is tighter than the existing bounds for common Bayesian models and is also consistent with the likelihood principle. We apply the privacy loss quantified by the new bound to release differentially private synthetic data from Bayesian models in several experiments and show the improved utility of the synthetic data compared to those generated from explicitly designed randomization mechanisms that privatize posterior distributions.\",\"PeriodicalId\":102306,\"journal\":{\"name\":\"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-04-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3508398.3519355\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3508398.3519355","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A New Bound for Privacy Loss from Bayesian Posterior Sampling
Differential privacy (DP) is a state-of-the-art concept that formalizes privacy guarantees. We derive a new bound for the privacy loss from releasing Bayesian posterior samples in the setting of DP. The new bound is tighter than the existing bounds for common Bayesian models and is also consistent with the likelihood principle. We apply the privacy loss quantified by the new bound to release differentially private synthetic data from Bayesian models in several experiments and show the improved utility of the synthetic data compared to those generated from explicitly designed randomization mechanisms that privatize posterior distributions.
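To make the setting concrete, the following is a minimal sketch of posterior sampling as a DP release mechanism, paired with the classical log-likelihood-ratio bound from the prior literature (not the new bound derived in this paper). The model (Beta-Bernoulli with the success probability restricted to [c, 1-c]), the function names, and the constants are illustrative assumptions, not the authors' implementation.

# Sketch: release one posterior sample from a Beta-Bernoulli model whose
# parameter is truncated to [c, 1-c], and compute the classical privacy-loss
# bound eps <= 2 * sup_theta |log p(x|theta) - log p(x'|theta)| over
# neighboring records x, x'. Illustrative only; not the paper's new bound.
import numpy as np
from scipy import stats

def truncated_beta_posterior_sample(data, a0=1.0, b0=1.0, c=0.1, rng=None):
    """Draw one sample from the Beta(a0 + sum x, b0 + n - sum x) posterior,
    truncated to [c, 1 - c] via inverse-CDF sampling."""
    rng = np.random.default_rng(rng)
    a = a0 + np.sum(data)
    b = b0 + len(data) - np.sum(data)
    lo, hi = stats.beta.cdf([c, 1.0 - c], a, b)   # posterior mass endpoints
    u = rng.uniform(lo, hi)                       # uniform draw inside the truncated mass
    return stats.beta.ppf(u, a, b)

def classical_privacy_loss_bound(c):
    """Classical bound: twice the worst-case log-likelihood ratio between
    neighboring binary records, i.e. 2 * sup_theta |log(theta/(1-theta))|
    over theta in [c, 1-c]."""
    return 2.0 * np.log((1.0 - c) / c)

# Toy usage with six hypothetical binary records.
data = np.array([1, 0, 1, 1, 0, 1])
theta_sample = truncated_beta_posterior_sample(data, c=0.1, rng=0)
print("released posterior sample:", theta_sample)
print("classical privacy-loss bound:", classical_privacy_loss_bound(0.1))

The truncation to [c, 1-c] is what keeps the log-likelihood ratio, and hence this classical bound, finite; the paper's contribution is a tighter bound on the same quantity for common Bayesian models.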