{"title":"Disclosure Risk from Homogeneity Attack in Differentially Private Release of Frequency Distribution","authors":"F. Liu, Xingyuan Zhao","doi":"10.1145/3508398.3519357","DOIUrl":null,"url":null,"abstract":"Differential privacy (DP) provides a robust model to achieve privacy guarantees in released information. We examine the robustness of the protection against homogeneity attack (HA) in multi-dimensional frequency distributions sanitized via DP randomization mechanisms. We propose measures for disclosure risk from HA and derive closed-form relationships between privacy loss parameters in DP and disclosure risk from HA. We also provide a lower bound to the disclosure risk on a sensitive attribute when all the cells formed by quasi-identifiers are homogeneous for the sensitive attribute. The availability of the closed-form relationships helps understand the abstract concepts of DP and privacy loss parameters by putting them in the context of a concrete privacy attack and offers a perspective for choosing privacy loss parameters when employing DP mechanisms to release information in practice. We apply the closed-form mathematical relationships on real-life datasets to assess disclosure risk due to HA in differentially private sanitized frequency distributions at various privacy loss parameters.","PeriodicalId":102306,"journal":{"name":"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Twelfth ACM Conference on Data and Application Security and Privacy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3508398.3519357","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Differential privacy (DP) provides a robust model for achieving privacy guarantees in released information. We examine the robustness of the protection against the homogeneity attack (HA) in multi-dimensional frequency distributions sanitized via DP randomization mechanisms. We propose measures of disclosure risk from HA and derive closed-form relationships between the privacy loss parameters in DP and the disclosure risk from HA. We also provide a lower bound on the disclosure risk for a sensitive attribute when all the cells formed by quasi-identifiers are homogeneous in that attribute. The availability of the closed-form relationships helps in understanding the abstract concepts of DP and privacy loss parameters by putting them in the context of a concrete privacy attack, and offers a perspective for choosing privacy loss parameters when employing DP mechanisms to release information in practice. We apply the closed-form mathematical relationships to real-life datasets to assess the disclosure risk due to HA in differentially private sanitized frequency distributions at various privacy loss parameters.
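To make the setting concrete, the sketch below (not the authors' proposed risk measures or closed-form results, just a minimal illustration) applies the standard Laplace mechanism to a small quasi-identifier-by-sensitive-attribute frequency table and flags quasi-identifier cells that are homogeneous in the sensitive attribute, i.e., the cells an HA adversary can exploit absent DP noise. All counts, the epsilon value, and the variable names are hypothetical.

```python
# Illustrative sketch only: DP-sanitize a 2-way frequency table with the
# Laplace mechanism and flag quasi-identifier (QI) cells that are homogeneous
# in the sensitive attribute, i.e., vulnerable to a homogeneity attack.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical counts: rows index QI levels, columns index sensitive levels.
# counts[i, j] = number of records with QI level i and sensitive level j.
counts = np.array([
    [5, 0, 0],   # QI cell 0 is homogeneous: every record has sensitive level 0
    [2, 3, 1],
    [0, 0, 4],   # QI cell 2 is homogeneous as well
])

epsilon = 1.0         # privacy loss parameter (assumed value)
sensitivity = 1.0     # adding/removing one record changes one cell count by 1

# Laplace mechanism: add noise with scale = sensitivity / epsilon to each cell.
noisy = counts + rng.laplace(scale=sensitivity / epsilon, size=counts.shape)

# Common post-processing: clip negatives and round back to integer counts.
sanitized = np.clip(np.round(noisy), 0, None).astype(int)

# Homogeneity check on the original table: a QI cell is vulnerable when all of
# its records share one sensitive value, so an adversary who links a target to
# that cell learns the sensitive value exactly in the unsanitized release.
row_totals = counts.sum(axis=1)
homogeneous = (counts.max(axis=1) == row_totals) & (row_totals > 0)

print("sanitized counts:\n", sanitized)
print("QI cells homogeneous in the sensitive attribute:", np.where(homogeneous)[0])
```

In the paper's framing, the question is how much residual disclosure risk such homogeneous cells carry after the Laplace noise is added, and how that risk varies with epsilon; the closed-form relationships the authors derive quantify this instead of relying on simulation.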