Do optimal entropy-constrained quantizers have a finite or infinite number of codewords?

A. György, T. Linder, P. Chou, B. J. Betts
{"title":"最优熵约束量化器的码字数量是有限的还是无限的?","authors":"A. György, T. Linder, P. Chou, B. J. Betts","doi":"10.1109/TIT.2003.819340","DOIUrl":null,"url":null,"abstract":"An entropy-constrained quantizer Q is optimal if it minimizes the expected distortion D(Q) subject to a constraint on the output entropy H(Q). We use the Lagrangian formulation to show the existence and study the structure of optimal entropy-constrained quantizers that achieve a point on the lower convex hull of the operational distortion-rate function D/sub h/(R) = inf/sub Q/{D(Q) : H(Q) /spl les/ R}. In general, an optimal entropy-constrained quantizer may have a countably infinite number of codewords. Our main results show that if the tail of the source distribution is sufficiently light (resp., heavy) with respect to the distortion measure, the Lagrangian-optimal entropy-constrained quantizer has a finite (resp., infinite) number of codewords. In particular, for the squared error distortion measure, if the tail of the source distribution is lighter than the tail of a Gaussian distribution, then the Lagrangian-optimal quantizer has only a finite number of codewords, while if the tail is heavier than that of the Gaussian, the Lagrangian-optimal quantizer has an infinite number of codewords.","PeriodicalId":13250,"journal":{"name":"IEEE Trans. Inf. Theory","volume":"29 1","pages":"3031-3037"},"PeriodicalIF":0.0000,"publicationDate":"2003-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"34","resultStr":"{\"title\":\"Do optimal entropy-constrained quantizers have a finite or infinite number of codewords?\",\"authors\":\"A. György, T. Linder, P. Chou, B. J. Betts\",\"doi\":\"10.1109/TIT.2003.819340\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An entropy-constrained quantizer Q is optimal if it minimizes the expected distortion D(Q) subject to a constraint on the output entropy H(Q). We use the Lagrangian formulation to show the existence and study the structure of optimal entropy-constrained quantizers that achieve a point on the lower convex hull of the operational distortion-rate function D/sub h/(R) = inf/sub Q/{D(Q) : H(Q) /spl les/ R}. In general, an optimal entropy-constrained quantizer may have a countably infinite number of codewords. Our main results show that if the tail of the source distribution is sufficiently light (resp., heavy) with respect to the distortion measure, the Lagrangian-optimal entropy-constrained quantizer has a finite (resp., infinite) number of codewords. In particular, for the squared error distortion measure, if the tail of the source distribution is lighter than the tail of a Gaussian distribution, then the Lagrangian-optimal quantizer has only a finite number of codewords, while if the tail is heavier than that of the Gaussian, the Lagrangian-optimal quantizer has an infinite number of codewords.\",\"PeriodicalId\":13250,\"journal\":{\"name\":\"IEEE Trans. Inf. Theory\",\"volume\":\"29 1\",\"pages\":\"3031-3037\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"34\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Trans. Inf. 
Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TIT.2003.819340\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Trans. Inf. Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TIT.2003.819340","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 34

Abstract

An entropy-constrained quantizer Q is optimal if it minimizes the expected distortion D(Q) subject to a constraint on the output entropy H(Q). We use the Lagrangian formulation to show the existence and study the structure of optimal entropy-constrained quantizers that achieve a point on the lower convex hull of the operational distortion-rate function D_h(R) = inf_Q {D(Q) : H(Q) ≤ R}. In general, an optimal entropy-constrained quantizer may have a countably infinite number of codewords. Our main results show that if the tail of the source distribution is sufficiently light (resp., heavy) with respect to the distortion measure, the Lagrangian-optimal entropy-constrained quantizer has a finite (resp., infinite) number of codewords. In particular, for the squared error distortion measure, if the tail of the source distribution is lighter than the tail of a Gaussian distribution, then the Lagrangian-optimal quantizer has only a finite number of codewords, while if the tail is heavier than that of the Gaussian, the Lagrangian-optimal quantizer has an infinite number of codewords.
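As a rough numerical illustration of the Lagrangian formulation the abstract refers to (a sketch, not the paper's construction or proofs), the code below runs an ECVQ-style design iteration that minimizes the Lagrangian cost J(Q) = D(Q) + λ·H(Q) on training samples under squared error distortion. The function name ecvq, its parameters, and the training-sample setup are assumptions made here for illustration.

```python
import numpy as np

def ecvq(samples, num_codewords, lam, iters=100, seed=0):
    """Lagrangian (entropy-constrained) scalar quantizer design on training samples."""
    rng = np.random.default_rng(seed)
    codebook = rng.choice(samples, size=num_codewords, replace=False).astype(float)
    probs = np.full(codebook.size, 1.0 / codebook.size)

    for _ in range(iters):
        # Encoding rule: choose the codeword minimizing d(x, c_i) + lam * len_i,
        # where len_i = -log2(p_i) is the ideal entropy-code length of index i.
        lengths = -np.log2(np.maximum(probs, 1e-300))
        costs = (samples[:, None] - codebook[None, :]) ** 2 + lam * lengths[None, :]
        assign = costs.argmin(axis=1)

        # Update step: centroids and empirical index probabilities.
        # Cells that capture no samples are dropped, so the codebook can shrink.
        keep = np.unique(assign)
        codebook = np.array([samples[assign == i].mean() for i in keep])
        probs = np.array([np.mean(assign == i) for i in keep])

    # Final operating point: distortion D(Q) and output entropy H(Q) in bits.
    lengths = -np.log2(probs)
    costs = (samples[:, None] - codebook[None, :]) ** 2 + lam * lengths[None, :]
    assign = costs.argmin(axis=1)
    counts = np.bincount(assign, minlength=codebook.size)
    p = counts[counts > 0] / counts.sum()
    distortion = np.mean((samples - codebook[assign]) ** 2)
    entropy = -np.sum(p * np.log2(p))
    return codebook, distortion, entropy

# Example (illustrative): a standard Gaussian source. Larger lam pushes the
# design toward fewer, coarser codewords (lower entropy, higher distortion).
x = np.random.default_rng(1).normal(size=20000)
codebook, D, H = ecvq(x, num_codewords=64, lam=0.1)
print(f"{codebook.size} codewords, distortion {D:.4f}, entropy {H:.2f} bits")
```

In such a training-based design, cells that lose all samples simply disappear, which loosely mirrors the paper's question of whether a Lagrangian-optimal quantizer for the true source distribution needs finitely or infinitely many codewords.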