Proceedings DCC 2002. Data Compression Conference: Latest Publications

A wavelet based low complexity embedded block coding algorithm
B. Das, S. Banerjee
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999995
Abstract: Summary form only given. Along with compression efficiency, other factors, like complexity, are significant issues for image coding. The measure of complexity varies from application to application. To overcome the problems of large database maintenance and the high computational burden of EZT and SPIHT, a new algorithm, WEBLOC (Wavelet-based Embedded BLOck Coding), is proposed for low-complexity, near-lossless compression. The most significant characteristics of this algorithm involve (a) sign-bit arrangement and (b) subband intensity distribution statistics. Results were obtained with monochrome 8 bpp, 256×256 images. The salient features of the coding algorithm can be summarized as follows. Entropy coding is replaced by fixed run-length coding (RLC), considerably reducing the computational and time overhead. For near-lossless image compression, the reduction of complexity greatly reduces the hardware circuitry. The running memory overhead for any list is reduced as compared to EZT and SPIHT. However, the memory requirement for storing the wavelet coefficients is not reduced.
Citations: 0
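The WEBLOC summary above replaces entropy coding with fixed run-length coding. As a rough illustration of what a fixed RLC stage does (a hypothetical sketch, not the authors' implementation), runs of identical bits can be stored with a fixed-width length field:

```python
def rle_encode(bits, run_bits=4):
    """Fixed run-length code: each run of identical bits becomes a
    (bit, run length) pair, with the length limited to a fixed-width
    field of run_bits bits. Longer runs are split."""
    max_run = (1 << run_bits) - 1
    out, i = [], 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i] and j - i < max_run:
            j += 1
        out.append((bits[i], j - i))
        i = j
    return out

def rle_decode(pairs):
    """Inverse of rle_encode: expand (bit, length) pairs back to bits."""
    return [b for b, n in pairs for _ in range(n)]
```

Because both the symbol and the length field have fixed widths, no adaptive model or entropy-coder state is needed, which is the source of the complexity savings the abstract describes.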
A method for compressing lexicons
S. Ristov, Eric Guy Claude Laporte
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.1000013
Abstract: Summary form only given. Lexicon lookup is an essential part of almost every natural language processing system. A natural language lexicon is a set of strings where each string consists of a word and the associated linguistic data. Its computer representation is a structure that returns appropriate linguistic data for a given input word; it should be small and fast. We propose a method for lexicon compression based on a very efficient trie compression method and the inverted file paradigm. The method was applied to a 664,000-string, 18 Mbyte French phonetic and grammatical electronic dictionary for spelling-to-phonetics conversion. Entries in the lexicon are strings consisting of a word, its phonetic transcription, and some additional codes.
Citations: 2
Fast peak autocorrelation finding for periodicity-exploiting compression methods
C. Constantinescu, R. Arps
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999994
Abstract: Summary form only given. Bilevel image compression algorithms like JBIG, JBIG2-Generic, and PRECIS can exploit 1D or 2D peak autocorrelation in binary images such as digital halftones to achieve breakthrough boosts in compression. For hard-to-compress but periodic halftones, boosts of factors of three or more in compression ratio, and similar increases in decompression speed, can be achieved (boosts measured against the closest related non-periodicity-exploiting algorithm, e.g. JBIG or JBIG2-Generic with AT>0 vs. AT=0, or PRECIS vs. MMR). Our peak autocorrelation finding method isolates the period needed for peak compression two orders of magnitude faster than the prior art.
Citations: 0
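The brute-force baseline that such a method speeds up can be sketched as follows: scan candidate lags and keep the one with the highest normalized autocorrelation. This is an illustrative O(n·max_lag) version, not the authors' fast technique:

```python
def dominant_period(row, max_lag=None):
    """Return the lag (> 0) with the highest normalized autocorrelation
    of a binary row: the fraction of positions i where row[i] == row[i+lag].
    Brute force; periodicity-exploiting coders would use this lag as the
    horizontal period of the halftone pattern."""
    n = len(row)
    if max_lag is None:
        max_lag = n // 2
    best_lag, best_score = 1, -1.0
    for lag in range(1, max_lag + 1):
        matches = sum(1 for i in range(n - lag) if row[i] == row[i + lag])
        score = matches / (n - lag)  # normalize so long lags are comparable
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

For a strictly periodic row the true period wins ties against its multiples because it is examined first; real halftones are noisy, which is where peak (rather than exact) autocorrelation matters.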
PPMexe: PPM for compressing software
M. Drinic, D. Kirovski
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999957
Abstract: With the emergence of software delivery platforms such as Microsoft's .NET, code compression has become one of the core enabling technologies strongly affecting system performance. We present PPMexe, a set of compression mechanisms for executables that exploits their syntax and semantics to achieve superior compression rates. The foundation of PPMexe is the generic paradigm of prediction by partial matching (PPM). We combine PPM with two pre-processing steps: instruction rescheduling to improve prediction rates, and partitioning of a program binary into streams with high auto-correlation. We improve the traditional PPM algorithm with an additional alphabet of frequent variable-length super-symbols extracted from the input stream of fixed-length symbols, and a low-overhead mechanism that enables decompression starting from an arbitrary instruction of the executable, a feature pivotal for run-time software delivery. PPMexe was implemented for x86 binaries and tested on several large Microsoft applications. Binaries compressed using PPMexe were 16-23% smaller than files created using PPMD, the best available compressor.
Citations: 15
Rate-based versus distortion-based optimal joint source-channel coding
R. Hamzaoui, V. Stanković
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/dcc.2002.999944
Abstract: We consider a joint source-channel coding system that protects an embedded wavelet bitstream against noise using a finite family of channel codes with error detection and error correction capability. The performance of this system may be measured by the expected distortion or by the expected number of correctly received source bits, subject to a target total transmission rate. Whereas a rate-based optimal solution can be found in linear time, the computation of a distortion-based optimal solution is prohibitive. Under the assumption of the convexity of the operational distortion-rate function of the source coder, we give a lower bound on the expected distortion of a distortion-based optimal solution that depends only on a rate-based optimal solution. Then we show that a distortion-based optimal solution provides a stronger error protection than a rate-based optimal solution and exploit this result to reduce the time complexity of the distortion-based optimization. Finally, we propose a fast iterative improvement algorithm that starts from a rate-based optimal solution and converges to a local minimum of the expected distortion. Experimental results for a binary symmetric channel with the SPIHT coder and JPEG 2000 show that our lower bound is close to optimal. Moreover, the solution given by our local search algorithm has about the same quality as a distortion-based optimal solution, whereas its complexity is much lower than that of the previous best solution.
Citations: 59
Quantization as histogram segmentation: globally optimal scalar quantizer design in network systems
D. Muresan, M. Effros
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999968
Abstract: We propose a polynomial-time algorithm for optimal scalar quantizer design on discrete-alphabet sources. Special cases of the proposed approach yield optimal design algorithms for fixed-rate and entropy-constrained scalar quantizers, multi-resolution scalar quantizers, multiple description scalar quantizers, and Wyner-Ziv scalar quantizers. The algorithm guarantees globally optimal solutions for fixed-rate and entropy-constrained scalar quantizers and constrained optima for the other coding scenarios. We derive the algorithm by demonstrating the connection between scalar quantization, histogram segmentation, and the shortest path problem in a certain directed acyclic graph.
Citations: 42
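The histogram-segmentation view can be made concrete with a small dynamic program: designing a fixed-rate K-level quantizer for a discrete histogram amounts to choosing K contiguous cells, i.e. a shortest path over cell boundaries. This is a simplified sketch of the fixed-rate case only, not the paper's full algorithm:

```python
def optimal_quantizer(values, counts, K):
    """Partition the sorted histogram (values[i] with weight counts[i])
    into K contiguous cells minimizing total squared error.
    dp[k][j] = min cost of covering values[:j] with k cells; each edge
    (i -> j) in the underlying DAG costs the squared error of one cell."""
    n = len(values)
    pc = [0.0] * (n + 1)   # prefix counts
    ps = [0.0] * (n + 1)   # prefix weighted sums
    ps2 = [0.0] * (n + 1)  # prefix weighted sums of squares
    for i, (v, c) in enumerate(zip(values, counts)):
        pc[i + 1] = pc[i] + c
        ps[i + 1] = ps[i] + c * v
        ps2[i + 1] = ps2[i] + c * v * v

    def cell_cost(i, j):  # squared error of values[i:j] about its centroid
        c = pc[j] - pc[i]
        if c == 0:
            return 0.0
        s = ps[j] - ps[i]
        return (ps2[j] - ps2[i]) - s * s / c

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(K + 1)]
    back = [[0] * (n + 1) for _ in range(K + 1)]
    dp[0][0] = 0.0
    for k in range(1, K + 1):
        for j in range(1, n + 1):
            for i in range(k - 1, j):
                cand = dp[k - 1][i] + cell_cost(i, j)
                if cand < dp[k][j]:
                    dp[k][j], back[k][j] = cand, i

    # recover cell boundaries, then centroids (reconstruction levels)
    bounds, j = [], n
    for k in range(K, 0, -1):
        i = back[k][j]
        bounds.append((i, j))
        j = i
    bounds.reverse()
    levels = [(ps[b] - ps[a]) / (pc[b] - pc[a]) for a, b in bounds if pc[b] > pc[a]]
    return dp[K][n], levels
```

The prefix sums make each cell cost O(1), so the whole design runs in O(K·n²), polynomial in the alphabet size as the abstract states for the fixed-rate case.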
Index compression through document reordering
Daniel K. Blandford, G. Blelloch
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999972
Abstract: An important concern in the design of search engines is the construction of an inverted index. An inverted index, also called a concordance, contains a list of documents (or posting list) for every possible search term. These posting lists are usually compressed with difference coding. Difference coding yields the best compression when the lists to be coded have high locality. Coding methods have been designed to specifically take advantage of locality in inverted indices. Here, we describe an algorithm to permute the document numbers so as to create locality in an inverted index. This is done by clustering the documents. Our algorithm, when applied to the TREC ad hoc database (disks 4 and 5), improves the performance of the best difference coding algorithm we found by fourteen percent. The improvement increases as the size of the index increases, so we expect that greater improvements would be possible on larger datasets.
Citations: 136
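Difference coding, as used for the posting lists above, can be sketched with gaps stored as variable-byte integers (a common pairing chosen here for illustration; not necessarily the coder evaluated in the paper). It also shows why reordering helps: clustered document numbers give small gaps, which take fewer bytes.

```python
def vbyte_encode_postings(doc_ids):
    """Difference-code a sorted posting list, storing each gap as a
    variable-byte integer: 7 payload bits per byte, high bit set on
    the last byte of each gap."""
    out, prev = bytearray(), 0
    for d in doc_ids:
        gap, prev = d - prev, d
        chunk = []
        while True:
            chunk.append(gap & 0x7F)
            gap >>= 7
            if gap == 0:
                break
        chunk.reverse()
        chunk[-1] |= 0x80  # terminator flag
        out.extend(chunk)
    return bytes(out)

def vbyte_decode_postings(data):
    """Inverse: rebuild gaps from the byte stream, then prefix-sum."""
    ids, cur, val = [], 0, 0
    for b in data:
        val = (val << 7) | (b & 0x7F)
        if b & 0x80:
            cur += val
            ids.append(cur)
            val = 0
    return ids
```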
MPEG-7 binary format for XML data
U. Niedermeier, J. Heuer, A. Hutter, W. Stechele
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.1000010
Abstract: Summary form only given. For the MPEG-7 standard, a binary format for the encoding of XML data has been developed that meets a set of requirements derived from a wide range of targeted applications. The resulting key features of the binary format are: high data compression (up to 98% for the document structure), provision of streaming, dynamic update of the document structure, random order of transmission of XML elements, and fast random access to data entities in the compressed stream. To provide these functionalities, a novel, schema-aware approach was taken that exploits knowledge of the standardized MPEG-7 schema. The XML schema definition is used to assign codes to the individual children of an XML element. These codes are signalled in the binary format to select nodes in the XML description tree. The binary format bit stream is organized as a sequence of access units. Each access unit can be decoded independently and contains information about a fragment of the description (fragment payload) and where to place the fragment in the current tree (context path). Compared to the standard text compressor ZIP, or the XML-optimized tool XMill, the MPEG-7 binary format achieves a 2-5 times better compression of the document structure and provides additional functionalities. These increase the flexibility and make it useful in broadcast applications and scenarios with limited bandwidth.
Citations: 3
Adaptive parametric vector quantization by natural type selection
Y. Kochman, R. Zamir
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.999979
Abstract: We present a new adaptive mechanism for empirical "on-line" design of a vector quantizer codebook. The proposed scheme is based on the principle of "natural type selection" (NTS) (Zamir and Rose, 2001). The NTS principle implies that backward adaptation, i.e., adaptation directed by the past reconstruction rather than by the uncoded source sequence, converges to an optimum rate-distortion codebook. We incorporate the NTS iteration step into a parametric encoder. We demonstrate that the codebook converges to an optimum rate-distortion solution within the associated parametric class. This new scheme does not suffer from the severe complexity at high dimensions of nonparametric solutions like the generalized Lloyd algorithm (GLA). Moreover, unlike existing parametric adaptive schemes (e.g., code-excited linear prediction (CELP)), this scheme is optimal even for low coding rates.
Citations: 14
On coding of sources with two-sided geometric distribution using binary decomposition
A. Krivoulets
Proceedings DCC 2002. Data Compression Conference. Pub Date: 2002-04-02. DOI: 10.1109/DCC.2002.1000002
Abstract: Summary form only given. We address the problem of entropy coding of integers i ∈ Z with a probability distribution defined as the two-sided geometric distribution (TSGD), which arises mainly in image and video compression. An efficient method based on binary tree decomposition of the source alphabet, combined with binary arithmetic coding, was proposed for coding of DC and AC coefficients of the DCT in the JPEG image compression standard. Binary decomposition allows for efficient coding of sources with large alphabets and skewed distributions. We propose two binary decompositions for coding of sources with the TSGD.
Citations: 3
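To see how a binary decomposition turns a TSGD source into a sequence of binary decisions for an arithmetic coder, here is one illustrative binarization (zero flag, sign bit, unary magnitude); the two decompositions actually proposed in the paper are not reproduced here:

```python
def tsgd_binarize(x):
    """Decompose a signed integer into binary decisions: an 'is zero?'
    flag, a sign bit, then the magnitude in unary (|x|-1 'continue'
    bits and one 'stop' bit). For a two-sided geometric source, each
    decision can be fed to a binary arithmetic coder with its own
    context, since each has a roughly stable probability."""
    if x == 0:
        return [1]                       # zero flag set
    bits = [0, 1 if x < 0 else 0]        # not zero; sign
    return bits + [1] * (abs(x) - 1) + [0]  # unary magnitude

def tsgd_debinarize(bits):
    """Inverse of tsgd_binarize for a single symbol."""
    it = iter(bits)
    if next(it) == 1:
        return 0
    sign = -1 if next(it) == 1 else 1
    m = 1
    while next(it) == 1:
        m += 1
    return sign * m
```

For a geometric magnitude distribution, every unary 'continue' decision has the same success probability, which is what makes simple per-position contexts effective.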