Tight Bounds for Communication-Assisted Agreement Distillation

V. Guruswami, J. Radhakrishnan
{"title":"Tight Bounds for Communication-Assisted Agreement Distillation","authors":"V. Guruswami, J. Radhakrishnan","doi":"10.4230/LIPIcs.CCC.2016.6","DOIUrl":null,"url":null,"abstract":"Suppose Alice holds a uniformly random string X ∈ {0, 1}N and Bob holds a noisy version Y of X where each bit of X is flipped independently with probability e ∈ [0, 1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2-γk (γk ≥ 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely [EQUATION], where C:= 4e(1 - e). In particular, the optimal communication to achieve Ω(1) agreement probability approaches 4e(1 - e)k. \n \nWe also consider the case when Y is the output of the binary erasure channel on X, where each bit of Y equals the corresponding bit of X with probability 1 - e and is otherwise erased (that is, replaced by a '?'). In this case, the communication required becomes [EQUATION]. In particular, the optimal communication to achieve Ω(1) agreement probability approaches ek, and with no communication the optimal agreement probability approaches [EQUATION]. \n \nOur protocols are based on covering codes and extend the approach of (Bogdanov and Mossel, 2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the model of bit-flips, our argument extends the approach of (Bogdanov and Mossel, 2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement was not studied before, and it was established (given our application) by (Nair and Wang 2015). We also obtain information complexity lower bounds for these tasks, and together with our protocol, they shed light on the recently popular \"most informative Boolean function\" conjecture of Courtade and Kumar.","PeriodicalId":246506,"journal":{"name":"Cybersecurity and Cyberforensics Conference","volume":"66 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"14","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cybersecurity and Cyberforensics Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4230/LIPIcs.CCC.2016.6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 14

Abstract

Suppose Alice holds a uniformly random string X ∈ {0, 1}^N and Bob holds a noisy version Y of X in which each bit of X is flipped independently with probability ε ∈ [0, 1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2^{-γk} (γk ≥ 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely [EQUATION], where C := 4ε(1 − ε). In particular, the optimal communication to achieve Ω(1) agreement probability approaches 4ε(1 − ε)k.

We also consider the case where Y is the output of the binary erasure channel on X, in which each bit of Y equals the corresponding bit of X with probability 1 − ε and is otherwise erased (that is, replaced by a '?'). In this case, the communication required becomes [EQUATION]. In particular, the optimal communication to achieve Ω(1) agreement probability approaches εk, and with no communication the optimal agreement probability approaches [EQUATION].

Our protocols are based on covering codes and extend the approach of Bogdanov and Mossel (2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the bit-flip model, our argument extends the approach of Bogdanov and Mossel (2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement had not been studied before, and it was established (given our application) by Nair and Wang (2015). We also obtain information complexity lower bounds for these tasks; together with our protocol, they shed light on the recently popular "most informative Boolean function" conjecture of Courtade and Kumar.
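To make the setup above concrete, the following Python sketch (our own illustration, not code or a protocol from the paper; the names `bsc`, `bec`, and `agreement_rate` and all parameter values are arbitrary choices) simulates the two noise models, estimates the agreement probability of the naive zero-communication strategy in which both parties simply output their first k symbols, and evaluates the communication quantities 4ε(1 − ε)k and εk quoted in the abstract for a sample ε.

```python
# Toy experiment for the agreement-distillation setting described above.
# NOT the paper's covering-code protocol: it only shows the naive
# zero-communication baseline (both parties output their first k symbols)
# and evaluates the abstract's communication quantities for a sample eps.
import random

def bsc(x, eps):
    """Bit-flip model: flip each bit of x independently with probability eps."""
    return [b ^ (random.random() < eps) for b in x]

def bec(x, eps):
    """Erasure model: erase each bit independently with probability eps ('?')."""
    return [('?' if random.random() < eps else b) for b in x]

def agreement_rate(channel, N=64, k=16, eps=0.1, trials=20000):
    """Empirical probability that Alice's and Bob's first k symbols coincide."""
    hits = 0
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(N)]  # Alice's uniform string
        y = channel(x, eps)                           # Bob's noisy copy
        hits += (x[:k] == y[:k])
    return hits / trials

if __name__ == "__main__":
    eps, k = 0.1, 16
    # Naive baseline: each of the first k symbols survives with prob. 1 - eps.
    print("naive baseline (1 - eps)^k :", (1 - eps) ** k)
    print("BSC empirical agreement    :", agreement_rate(bsc, eps=eps, k=k))
    print("BEC empirical agreement    :", agreement_rate(bec, eps=eps, k=k))
    # Per the abstract, roughly this much communication suffices for Omega(1)
    # agreement probability (as k grows): 4*eps*(1-eps)*k for bit-flips, eps*k
    # for erasures.
    print("BSC communication ~ 4*eps*(1-eps)*k :", 4 * eps * (1 - eps) * k)
    print("BEC communication ~ eps*k           :", eps * k)
```

For ε = 0.1 the naive zero-communication agreement probability decays like 0.9^k (about 0.19 at k = 16), whereas the abstract's bounds say that roughly 0.36k bits (bit-flip model) or 0.1k bits (erasure model) of communication suffice for constant agreement probability as k grows.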