Vector quantisation for wavelet based image compression

P. Fenwick, S. Woolford
{"title":"Vector quantisation for wavelet based image compression","authors":"P. Fenwick, S. Woolford","doi":"10.1109/DCC.1995.515575","DOIUrl":null,"url":null,"abstract":"Summary form only given. The present work arose from a need to transmit architectural line drawings over relatively slow communication links, such as telephone circuits. The images are mostly large line drawings, but with some shading. The application required good compression, incremental transmission, and excellent reproduction of sharp lines and fine detail such as text. The final system uses an initial wavelet transform stage (actually using a wave-packet transform), an adaptive vector quantiser stage, and a final post-compression stage. This paper emphasises the vector quantiser. Incremental transmission makes it desirable to use only actual data vectors in the database. The standard Linde Buzo Gray (LBG) algorithm was slow, taking 30-60 minutes for a training set, tended to use 'near-zero' vectors instead of 'true-zero' vectors introducing undesirable texture into the reconstructed image, and the quality could not be guaranteed with some images producing; artifacts at even low compression rates. The final vector quantiser uses new techniques with LRU maintenance of the database, updating for 'exact matches' to an existing vector and for 'near matches', using a combination of mean-square error and magnitude error. A conventional counting LRU mechanism is used, with different aging parameters for the two types of LRU update. The new vector quantiser requires about 10 seconds per image (compared with 30-60 minutes for LBG) and essentially eliminates the undesirable compression artifacts.","PeriodicalId":107017,"journal":{"name":"Proceedings DCC '95 Data Compression Conference","volume":"68 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1995-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings DCC '95 Data Compression Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCC.1995.515575","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Summary form only given. The present work arose from a need to transmit architectural line drawings over relatively slow communication links, such as telephone circuits. The images are mostly large line drawings, but with some shading. The application required good compression, incremental transmission, and excellent reproduction of sharp lines and fine detail such as text. The final system uses an initial wavelet transform stage (actually a wave-packet transform), an adaptive vector quantiser stage, and a final post-compression stage. This paper emphasises the vector quantiser. Incremental transmission makes it desirable to use only actual data vectors in the database. The standard Linde-Buzo-Gray (LBG) algorithm was slow, taking 30-60 minutes for a training set; it tended to use 'near-zero' vectors instead of 'true-zero' vectors, introducing undesirable texture into the reconstructed image; and its quality could not be guaranteed, with some images producing artifacts even at low compression rates. The final vector quantiser uses new techniques with LRU maintenance of the database, updating both for 'exact matches' to an existing vector and for 'near matches', using a combination of mean-square error and magnitude error. A conventional counting LRU mechanism is used, with different aging parameters for the two types of LRU update. The new vector quantiser requires about 10 seconds per image (compared with 30-60 minutes for LBG) and essentially eliminates the undesirable compression artifacts.
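To make the adaptive quantiser described above more concrete, the sketch below shows one plausible reading of its codebook maintenance: a combined mean-square/magnitude distortion measure, separate 'exact match' and 'near match' update paths with different counting-LRU refresh values, and insertion of the actual data vector (never a trained centroid) when no match is close enough. The class name, the thresholds, the 50/50 distortion weighting, and the aging constants are illustrative assumptions; the abstract gives no concrete values.

```python
# A minimal, illustrative sketch of an adaptive VQ codebook with counting-LRU
# maintenance, loosely following the abstract's description.  All numeric
# parameters and the exact distance weighting are assumptions, not values
# taken from the paper.

import numpy as np


class AdaptiveVQCodebook:
    def __init__(self, max_size=256, near_threshold=4.0,
                 exact_refresh=16, near_refresh=8):
        self.max_size = max_size              # codebook capacity
        self.near_threshold = near_threshold  # assumed 'near match' cut-off
        self.exact_refresh = exact_refresh    # assumed LRU credit for exact matches
        self.near_refresh = near_refresh      # assumed LRU credit for near matches
        self.vectors = []                     # stored actual data vectors
        self.age = []                         # counting-LRU credit per entry

    def _distance(self, a, b):
        # Combined distortion: mean-square error plus a magnitude-error term,
        # as the abstract suggests (the equal weighting is an assumption).
        mse = float(np.mean((a - b) ** 2))
        mag = abs(float(np.sum(np.abs(a))) - float(np.sum(np.abs(b))))
        return mse + mag

    def _age_all(self):
        # Conventional counting LRU: every access decays all entries by one.
        self.age = [a - 1 for a in self.age]

    def quantise(self, vec):
        """Return the index of the codebook entry used for 'vec',
        inserting the vector itself when no close match exists."""
        vec = np.asarray(vec, dtype=float)
        self._age_all()

        if self.vectors:
            dists = [self._distance(vec, v) for v in self.vectors]
            best = int(np.argmin(dists))
            if dists[best] == 0.0:                  # exact match
                self.age[best] = self.exact_refresh
                return best
            if dists[best] <= self.near_threshold:  # near match
                self.age[best] = self.near_refresh
                return best

        # No usable match: insert the actual data vector, evicting the
        # least-recently-used entry when the codebook is full.
        if len(self.vectors) >= self.max_size:
            victim = int(np.argmin(self.age))
            self.vectors[victim] = vec
            self.age[victim] = self.near_refresh
            return victim
        self.vectors.append(vec)
        self.age.append(self.near_refresh)
        return len(self.vectors) - 1


if __name__ == "__main__":
    # Toy usage: quantise a few 4-element coefficient vectors.
    cb = AdaptiveVQCodebook(max_size=4)
    for v in ([0, 0, 0, 0], [1, 2, 1, 0], [1, 2, 1, 0], [9, 9, 9, 9]):
        print(v, "->", cb.quantise(v))
```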