Contraction of Locally Differentially Private Mechanisms

Shahab Asoodeh;Huanyu Zhang
DOI: 10.1109/JSAIT.2024.3397305
IEEE Journal on Selected Areas in Information Theory, vol. 5, pp. 385–395. Published 2024-03-09.
https://ieeexplore.ieee.org/document/10527360/
Citation count: 0

Abstract

We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between $P{\mathsf K}$ and $Q{\mathsf K}$ output distributions of an $\varepsilon $ -LDP mechanism $\mathsf K$ in terms of a divergence between the corresponding input distributions P and Q, respectively. Our first main technical result presents a sharp upper bound on the $\chi ^{2}$ -divergence $\chi ^{2}(P{\mathsf K}\|Q{\mathsf K})$ in terms of $\chi ^{2}(P\|Q)$ and $\varepsilon $ . We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on $\chi ^{2}(P{\mathsf K}\|Q{\mathsf K})$ in terms of total variation distance ${\textsf {TV}}(P, Q)$ and $\varepsilon $ . We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam’s, Assouad’s, and the mutual information methods —powerful tools for bounding minimax estimation risks. These results are shown to lead to tighter privacy analyses than the state-of-the-arts in several statistical problems such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.
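The contraction phenomenon described above can be checked numerically for binary randomized response, the canonical $\varepsilon$-LDP mechanism. A minimal sketch, with the caveat that the factor $((e^{\varepsilon }-1)/(e^{\varepsilon }+1))^{2}$ used below is the known $\chi ^{2}$-contraction coefficient of this particular binary symmetric channel; taking it as the paper's general bound is an assumption based only on the abstract:

```python
import numpy as np

def chi2_div(p, q):
    """Chi-squared divergence: sum_i (p_i - q_i)^2 / q_i."""
    return float(np.sum((p - q) ** 2 / q))

def randomized_response(eps):
    """Binary randomized response, an eps-LDP mechanism,
    as a 2x2 row-stochastic kernel (rows indexed by input)."""
    stay = np.exp(eps) / (1.0 + np.exp(eps))
    return np.array([[stay, 1.0 - stay],
                     [1.0 - stay, stay]])

eps = 1.0
K = randomized_response(eps)
P = np.array([0.3, 0.7])   # input distribution P
Q = np.array([0.6, 0.4])   # input distribution Q

pre = chi2_div(P, Q)           # chi^2(P || Q)
post = chi2_div(P @ K, Q @ K)  # chi^2(PK || QK)

# Contraction factor ((e^eps - 1)/(e^eps + 1))^2: the known chi^2
# contraction coefficient of this binary symmetric channel (assumed
# here, based on the abstract, to match the paper's bound).
bound = ((np.exp(eps) - 1.0) / (np.exp(eps) + 1.0)) ** 2

print(f"chi2 before: {pre:.4f}, after: {post:.4f}, factor: {bound:.4f}")
assert post <= bound * pre + 1e-12
```

For $\varepsilon =1$ the factor is roughly $0.21$, so passing the data through the mechanism shrinks $\chi ^{2}(P\|Q)$ by about a factor of five, which is exactly the kind of contraction the minimax lower-bound arguments (Le Cam, Assouad, van Trees) exploit.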