Centralized and Distributed Lossy Source Coding of Densely Sampled Gaussian Data, with and without Transforms

D. Neuhoff, S. Pradhan
Published in: 2007 Information Theory and Applications Workshop
DOI: 10.1109/ITA.2007.4357596
Publication date: 2007-10-22
Citations: 1

Abstract

With mean-squared error D as a goal, it is well known that one may approach the rate-distortion function R(D) of a spatially nonbandlimited, time-IID, continuous-space, discrete-time Gaussian source by spatially sampling at a sufficiently high rate, applying the Karhunen-Loève transform to sufficiently long blocks, and independently coding the transform coefficients of each type at the first-order rate-distortion function of that type, with a distortion target chosen appropriately for each type. This paper compares and contrasts this classical result with several recently explored alternative schemes for encoding source samples taken at a high rate. The first scheme, which scalar quantizes the samples and then losslessly encodes the quantized samples at their entropy rate, is known to have rate approaching infinity as the sampling rate grows while distortion is held at D. Is such catastrophic behavior due to the scalar quantizer or to the distributed nature of the quantization? Recent results show that even without a transform, but with distributed vector quantization, it is possible to attain performance that differs from the rate-distortion function by only a finite constant. This suggests it was the scalar quantizer that caused the catastrophic behavior. A final recent result shows that the situation is more nuanced: if, in the classical scheme, entropy-coded scalar quantizers replace the ideal coding of the coefficients at their first-order rate-distortion functions, then performance again differs from the rate-distortion function by only a finite constant.
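The classical scheme described above allocates rate across KLT coefficient types by reverse water-filling on the coefficient variances: components with variance above a water level θ are coded with per-component distortion θ, and weaker components are discarded. The following is a minimal numerical sketch of that allocation (my own illustration, not from the paper; the AR(1) exponential-correlation covariance, block length, target distortion, and bisection depth are all assumed choices):

```python
import numpy as np

def gaussian_rd_waterfill(eigvals, D):
    """Per-sample rate (bits) for parallel Gaussian components with
    variances `eigvals` at average distortion D, via reverse water-filling:
    components with variance > theta get distortion theta and rate
    0.5*log2(variance/theta); the rest get rate 0."""
    lo, hi = 0.0, float(np.max(eigvals))
    for _ in range(100):  # bisect on the water level theta
        theta = 0.5 * (lo + hi)
        if np.mean(np.minimum(eigvals, theta)) < D:
            lo = theta  # water level too low: achieved distortion below D
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    rates = 0.5 * np.log2(np.maximum(eigvals, theta) / theta)
    return float(rates.mean())

# Variances of KLT coefficients = eigenvalues of the block covariance.
# Here: an AR(1)-style exponential correlation model, block length 64.
n, rho = 64, 0.95
idx = np.arange(n)
cov = rho ** np.abs(np.subtract.outer(idx, idx))
eigvals = np.linalg.eigvalsh(cov)

R = gaussian_rd_waterfill(eigvals, D=0.1)  # bits per sample at distortion 0.1
```

For a white source (all variances equal to 1) the allocation collapses to the familiar `0.5*log2(1/D)` per sample, which is a quick sanity check on the bisection.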
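The "finite constant" in the final result echoes the classical high-rate behavior of entropy-coded uniform scalar quantization, which for a Gaussian variable exceeds R(D) by roughly (1/2)log2(πe/6) ≈ 0.254 bit. A quick Monte Carlo check of that gap for a single unit-variance Gaussian (my own illustration, not an experiment from the paper; the sample size, step size, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)   # unit-variance Gaussian source
step = 0.1                         # uniform quantizer step (high-rate regime)
q = step * np.round(x / step)      # mid-tread uniform scalar quantizer

mse = float(np.mean((x - q) ** 2))        # distortion D, roughly step**2 / 12
_, counts = np.unique(q, return_counts=True)
p = counts / counts.sum()
H = float(-np.sum(p * np.log2(p)))        # entropy of quantizer output, bits
rd = 0.5 * np.log2(1.0 / mse)             # Gaussian R(D) at the measured D
gap = H - rd                              # close to 0.254 bit at high rate
```

The point of the abstract's contrast is that this per-coefficient gap stays bounded when such quantizers replace ideal coding inside the KLT scheme, whereas scalar quantization applied directly to densely sampled correlated data does not.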