A Comprehensive Study on Ziv-Zakai Lower Bounds on the MMSE

Impact Factor 2.2 · CAS Tier 3 (Computer Science) · JCR Q3 (Computer Science, Information Systems)
Minoh Jeong;Alex Dytso;Martina Cardone
{"title":"A Comprehensive Study on Ziv-Zakai Lower Bounds on the MMSE","authors":"Minoh Jeong;Alex Dytso;Martina Cardone","doi":"10.1109/TIT.2025.3541987","DOIUrl":null,"url":null,"abstract":"This paper explores Bayesian lower bounds on the minimum mean squared error (MMSE) that belong to the well-known Ziv-Zakai family. The Ziv-Zakai technique relies on connecting the bound to an <inline-formula> <tex-math>$\\mathsf M$ </tex-math></inline-formula>-ary hypothesis testing problem. There are three versions of the Ziv-Zakai bound (ZZB): the first version relies on the so-called <italic>valley-filling function</i>, the second one is a relaxation of the first bound which omits the valley-filling function, and the third one, namely the single-point ZZB (SZZB), replaces the integration present in the first two bounds with a single point maximization. The first part of this paper focuses on providing the most general version of the bounds. It is shown that these bounds hold without any assumption on the distribution of the estimand. This makes the bounds applicable to discrete and mixed distributions. Then, the SZZB is extended to an <inline-formula> <tex-math>$\\mathsf M$ </tex-math></inline-formula>-ary setting and a version of it that holds for the multivariate setting is provided. In the second part, general properties of these bounds are provided. First, unlike the Bayesian <italic>Cramér-Rao bound</i>, it is shown that all the versions of the ZZB <italic>tensorize</i>. Second, a characterization of the <italic>high-noise</i> asymptotic is provided, which is used to argue about the tightness of the bounds. Third, a complete <italic>low-noise</i> asymptotic is provided under the assumptions of mixed-input distributions and Gaussian additive noise channels. In the low-noise, it is shown that the ZZB is generally tight, but there are examples for which the SZZB is not tight. In the third part, the tightness of the bounds is evaluated. 
First, it is shown that in the low-noise regime the ZZB without the valley-filling function, and, therefore, also the ZZB with the valley-filling function, are tight for mixed-input distributions and Gaussian additive noise channels. Second, for discrete inputs it is shown that the ZZB with the valley-filling function is always sub-optimal, and equal to zero without the valley-filling function. Third, unlike for the ZZB, an example is shown for which the SZZB is tight to the MMSE for discrete inputs. Fourth, sufficient and necessary conditions for the tightness of the bounds are provided. Finally, some examples are provided in which the bounds in the Ziv-Zakai family outperform other well-known Bayesian lower bounds, namely the Cramér-Rao bound and the maximum entropy bound.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"71 4","pages":"3214-3236"},"PeriodicalIF":2.2000,"publicationDate":"2025-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Information Theory","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10884827/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

This paper explores Bayesian lower bounds on the minimum mean squared error (MMSE) that belong to the well-known Ziv-Zakai family. The Ziv-Zakai technique relies on connecting the bound to an $\mathsf M$-ary hypothesis testing problem. There are three versions of the Ziv-Zakai bound (ZZB): the first version relies on the so-called valley-filling function, the second is a relaxation of the first that omits the valley-filling function, and the third, namely the single-point ZZB (SZZB), replaces the integration present in the first two bounds with a single-point maximization. The first part of this paper focuses on providing the most general version of the bounds. It is shown that these bounds hold without any assumption on the distribution of the estimand, which makes them applicable to discrete and mixed distributions. Then, the SZZB is extended to an $\mathsf M$-ary setting, and a version of it that holds in the multivariate setting is provided. In the second part, general properties of these bounds are established. First, unlike the Bayesian Cramér-Rao bound, all versions of the ZZB are shown to tensorize. Second, a characterization of the high-noise asymptotic is provided, which is used to argue about the tightness of the bounds. Third, a complete low-noise asymptotic is provided under the assumptions of mixed-input distributions and Gaussian additive noise channels. In the low-noise regime, the ZZB is shown to be generally tight, but there are examples for which the SZZB is not tight. In the third part, the tightness of the bounds is evaluated. First, it is shown that in the low-noise regime the ZZB without the valley-filling function, and therefore also the ZZB with the valley-filling function, is tight for mixed-input distributions and Gaussian additive noise channels. Second, for discrete inputs it is shown that the ZZB with the valley-filling function is always sub-optimal, and equal to zero without the valley-filling function. Third, unlike for the ZZB, an example is shown for which the SZZB is tight to the MMSE for discrete inputs. Fourth, necessary and sufficient conditions for the tightness of the bounds are provided. Finally, some examples are given in which the bounds in the Ziv-Zakai family outperform other well-known Bayesian lower bounds, namely the Cramér-Rao bound and the maximum entropy bound.
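The ZZB without the valley-filling function can be made concrete in the simplest scalar case. A minimal numerical sketch (not taken from the paper; the function names and the specific relaxation are illustrative): for a Gaussian prior $X \sim \mathcal N(0, \sigma_x^2)$ observed in additive Gaussian noise, the classical bound $\mathrm{MMSE} \ge \int_0^\infty \frac{h}{2} \int (p_X(x)+p_X(x+h))\,P_{\min}(x, x+h)\,dx\,dh$ can be further relaxed, using $\min(a c, b d) \ge \min(a,b)\min(c,d)$, to an integral of the two Gaussian tail terms $Q(h/2\sigma_n)$ and $2Q(h/2\sigma_x)$, the latter being $\int \min(p_X(x), p_X(x+h))\,dx$ for a Gaussian prior. The sketch below evaluates this relaxed bound and compares it to the closed-form Gaussian MMSE $\sigma_x^2\sigma_n^2/(\sigma_x^2+\sigma_n^2)$:

```python
import math

def Q(t):
    """Gaussian tail probability P(N(0,1) > t)."""
    return 0.5 * math.erfc(t / math.sqrt(2.0))

def zzb_relaxed(sigma_x, sigma_n, h_max=20.0, n=20000):
    """Relaxed Ziv-Zakai lower bound on the MMSE of X ~ N(0, sigma_x^2)
    observed as Y = X + N with N ~ N(0, sigma_n^2):

        MMSE >= int_0^inf h * Q(h / (2 sigma_n)) * 2 Q(h / (2 sigma_x)) dh,

    where 2 Q(h / (2 sigma_x)) = int min(p_X(x), p_X(x+h)) dx for a
    Gaussian prior. Evaluated with the midpoint rule.
    """
    dh = h_max / n
    total = 0.0
    for i in range(n):
        h = (i + 0.5) * dh
        total += h * Q(h / (2.0 * sigma_n)) * 2.0 * Q(h / (2.0 * sigma_x)) * dh
    return total

def mmse_exact(sigma_x, sigma_n):
    """Closed-form MMSE for the jointly Gaussian case."""
    return sigma_x ** 2 * sigma_n ** 2 / (sigma_x ** 2 + sigma_n ** 2)

lb = zzb_relaxed(1.0, 1.0)
print(f"relaxed ZZB = {lb:.4f}, exact MMSE = {mmse_exact(1.0, 1.0):.4f}")
```

For $\sigma_x = \sigma_n = 1$ the relaxed integral evaluates in closed form to $1 - 2/\pi \approx 0.3634$, against an exact MMSE of $0.5$; the gap reflects the extra relaxation applied here, not the ZZB itself, which the paper shows is tight in the low-noise regime.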
Source journal: IEEE Transactions on Information Theory (Engineering: Electrical & Electronic)
CiteScore: 5.70
Self-citation rate: 20.00%
Articles per year: 514
Review time: 12 months
Journal description: The IEEE Transactions on Information Theory publishes theoretical and experimental papers concerned with the transmission, processing, and utilization of information. The boundaries of acceptable subject matter are intentionally not sharply delimited; rather, it is hoped that as the focus of research activity changes, a flexible policy will permit the Transactions to follow suit. Current appropriate topics are best reflected by recent Tables of Contents; they are summarized in the titles of the editorial areas that appear on the inside front cover.