2009 Data Compression Conference: Latest Publications

Guaranteed Synchronization of Huffman Codes with Known Position of Decoder
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.18
M. Biskup, Wojciech Plandowski
Abstract: In Huffman-encoded data a bit error may propagate arbitrarily far. This paper introduces a method for limiting such error propagation to at most L bits, L being a parameter. It is required that the decoder know the number of the bit currently being decoded. The method exploits the inherent tendency of Huffman codes to resynchronize spontaneously and introduces no redundancy when such a resynchronization takes place. The method is applied to parallel decoding of Huffman data and is tested on JPEG compression.
Citations: 5
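The spontaneous resynchronization the paper builds on is easy to observe in a toy decoder. The sketch below (the code table and message are illustrative examples, not taken from the paper) flips one bit in a Huffman-encoded stream and shows that after a short burst of wrong symbols the decoder falls back into step:

```python
# Toy demonstration of Huffman self-synchronization (code table and
# message are illustrative examples, not from the paper).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}
decode_map = {v: k for k, v in code.items()}

def decode(bits):
    """Greedy prefix-code decoder: emit a symbol whenever the buffer matches."""
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in decode_map:
            out.append(decode_map[cur])
            cur = ""
    return "".join(out)

msg = "abcdaabd"
bits = "".join(code[ch] for ch in msg)
corrupt = bits[:2] + ("1" if bits[2] == "0" else "0") + bits[3:]  # flip bit 2

print(decode(bits))     # the clean stream decodes exactly
print(decode(corrupt))  # early symbols differ, but the tail resynchronizes
```

With a known decoder position, the paper's method can exploit exactly this: once the corrupted and clean decodings realign, no extra redundancy is needed.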
Implementation of an Incremental MDL-Based Two Part Compression Algorithm for Model Inference
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.66
T. S. Markham, S. Evans, J. Impson, E. Steinbrecher
Abstract: We describe the implementation and performance of a compression-based model inference engine, MDLcompress. The MDL-based compression produces a two-part code of the training data, with the model portion of the code used to compress and classify test data. We present pseudo-code for the model-generation algorithms and explore the conflicting requirements of minimizing grammar size and minimizing descriptive cost. We show results of an MDL model-based classification system for network traffic anomaly detection.
Citations: 3
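The two-part code idea can be illustrated with the simplest possible model class. The sketch below (a Bernoulli toy example of my own, not MDLcompress itself) scores data as model bits plus data bits under the fitted model, using the standard ~0.5·log2(n) bits-per-real-parameter approximation for the model cost:

```python
import math

def mdl_cost(data):
    """Two-part MDL cost for a Bernoulli model: bits to describe the model
    (one real parameter, ~0.5*log2(n) bits) plus bits to encode the data
    under that model. A toy illustration, not the MDLcompress grammar code."""
    n, k = len(data), sum(data)
    model_bits = 0.5 * math.log2(n)
    p = k / n
    if p in (0.0, 1.0):
        data_bits = 0.0
    else:
        data_bits = -(k * math.log2(p) + (n - k) * math.log2(1.0 - p))
    return model_bits + data_bits
```

A skewed 100-bit source scores below its 100-bit raw length, while a balanced one does not: exactly the trade-off between model cost and descriptive cost the paper explores at grammar scale.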
Low-Complexity Joint Source/Channel Turbo Decoding of Arithmetic Codes with Image Transmission Application
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.31
Amin Zribi, S. Zaibi, R. Pyndiah, A. Bouallègue
Abstract: This paper presents a novel joint source/channel (JSC) decoding technique. The proposed approach enables iterative decoding of serially concatenated arithmetic and convolutional codes, with iterations performed between soft-in/soft-out (SISO) component decoders. For arithmetic decoding, we propose a low-complexity trellis-search technique to estimate the best transmitted codewords and generate soft outputs. Performance of the presented system is evaluated in terms of packet error rate (PER) for transmission over the AWGN channel. Simulation results show that the proposed iterative JSC scheme yields significant gains over traditional separate decoding. Finally, the practical relevance of the technique is validated in an image transmission system using the SPIHT codec.
Citations: 6
Multi Level Multiple Descriptions
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.49
T. A. Beery, R. Zamir
Abstract: Multiple description (MD) source coding is a method for overcoming unexpected information loss in a diversity system such as the Internet or a wireless network. While classic MD coding handles the situation where the rate on some channels drops temporarily to zero, causing unexpected packet loss, it fails to accommodate more subtle changes in link rate such as rate reduction. In such a case, a classic scheme cannot use the remaining link capacity for information transfer, so even a minor rate reduction must be treated as link failure. To accommodate this frequent situation, we propose a more modular design for transmitting over a diversity system that can handle unexpected reductions in link rate by downgrading the original description into a coarser one that fits the new link rate. The method is analyzed theoretically, and performance results are presented.
Citations: 2
On Compression of Data Encrypted with Block Ciphers
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.71
D. Klinc, Carmit Hazay, A. Jagmohan, H. Krawczyk, T. Rabin
Abstract: This paper investigates compression of encrypted data. It was previously shown that data encrypted with Vernam's scheme, also known as the one-time pad, can be compressed without knowledge of the secret key, a result that applies to the stream ciphers used in practice. However, it was not known how to compress data encrypted with non-stream ciphers. In this paper, we address the problem of compressing data encrypted with block ciphers, such as the Advanced Encryption Standard (AES) used in conjunction with one of the commonly employed chaining modes. We show that such data can be feasibly compressed without knowledge of the key, and we present performance results for practical code constructions used to compress binary sources.
Citations: 86
Flexible Predictions Selection for Multi-view Video Coding
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.35
F. Zhao, Guizhong Liu, Feifei Ren, N. Zhang
Abstract: Although the fixed HHI (Fraunhofer Heinrich Hertz Institute) scheme for multi-view video coding achieves very good performance by fully exploiting predictions in both the temporal and view directions, the complexity of this inter-prediction is very high. This paper presents techniques to reduce the complexity while maintaining the coding performance.
Citations: 2
Fast 15x15 Transform for Image and Video Coding Applications
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.81
Y. Reznik, R. Chivukula
Abstract: We derive a factorization of the size-15 DCT-II transform that requires only 14 multiplications, 67 additions, and 3 multiplications by rational dyadic constants (implementable by shifts). This transform is significantly less complex than the DCT-II of the nearest dyadic size (16), and we suggest considering it for future image and video coding applications that can benefit from block sizes larger than 8x8.
Citations: 4
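For context on why a 14-multiplication factorization matters, here is the textbook unscaled DCT-II computed directly from its definition, which costs N·N = 225 multiplications at N = 15 (my own reference sketch, not the paper's factorization):

```python
import math

def dct2_naive(x):
    """Textbook (unscaled) DCT-II: X[k] = sum_n x[n] * cos(pi*(n+1/2)*k/N).
    Computed directly this needs N*N multiplications; the paper's
    factorization brings N=15 down to 14 non-trivial multiplications."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]
```

A quick sanity check: a constant input concentrates all its energy in the DC coefficient X[0], with the remaining coefficients vanishing.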
On Minimum-Redundancy Fix-Free Codes
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.39
S. Savari
Abstract: Fix-free codes are variable-length codes in which no codeword is the prefix or suffix of another codeword. They are used in video compression standards because efficient decoding in both the forward and backward directions assists with error resilience. This property also potentially halves the average search time for a string in a compressed file relative to unidirectional variable-length codes. Relatively little is known about minimum-redundancy fix-free codes, and we describe some characteristics of and observations about such codes. We introduce a new heuristic, influenced by these ideas, for producing fix-free codes. The design of minimum-redundancy fix-free codes is an example of a constraint-processing problem, and we offer the first such approach to constructing them, along with a variation that adds a symmetry requirement.
Citations: 15
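The defining constraint is simple to check mechanically, which is what makes the problem amenable to constraint processing. A direct predicate (my own helper, with example codes of my choosing):

```python
def is_fix_free(codewords):
    """True if no codeword is a proper prefix or suffix of another, the
    defining property of a fix-free (bidirectionally decodable) code."""
    for a in codewords:
        for b in codewords:
            if a != b and (b.startswith(a) or b.endswith(a)):
                return False
    return True

print(is_fix_free(["0", "11", "101"]))   # fix-free: decodable in both directions
print(is_fix_free(["0", "01", "11"]))    # not fix-free: "0" is a prefix of "01"
```

Fixed-length codes are trivially fix-free; the interesting (and hard) part, which the paper addresses, is packing variable lengths close to the entropy while preserving this property.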
Universal Refinable Trellis Coded Quantization
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.16
S. Steger, T. Richter
Abstract: We introduce a novel universal refinable trellis coded quantization scheme (URTCQ) suitable for bitplane coding with many reconstruction stages. Existing refinable trellis quantizers either require excessive codebook training and are outperformed by scalar quantization beyond two stages (MS-TCQ, E-TCQ), impose a huge computational burden (SR-TCQ), or achieve good rate-distortion performance only in the last stage (UTCQ). The presented technique is a mixture of a scalar quantizer and an improved version of E-TCQ. For all supported sources, only one-time training on an i.i.d. uniform source is required, and the incremental bitrate is at most 1 bps per stage. The complexity is proportional to the number of stages and the number of trellis states. We compare the rate-distortion performance of our work on generalized Gaussian i.i.d. sources against the quantizers deployed in JPEG2000 (USDZQ, UTCQ). It is in no stage worse than the scalar quantizer and usually outperforms UTCQ, except in the last stage.
Citations: 7
Wavelet Image Two-Line Coder for Wireless Sensor Node with Extremely Little RAM
2009 Data Compression Conference · Pub Date: 2009-03-16 · DOI: 10.1109/DCC.2009.30
Stephan Rein, Stephan Lehmann, C. Gühmann
Abstract: This paper presents a novel wavelet image two-line (Wi2l) coder designed to meet the memory constraints of a typical wireless sensor node. The algorithm operates line-wise on picture data stored on the sensor's flash memory card and requires approximately 1.5 kB of RAM to compress a monochrome 256x256-byte picture. The achieved compression rates match those of the set partitioning in hierarchical trees (SPIHT) algorithm. The coder works recursively on two lines of a wavelet subband, storing intermediate data from these lines to backward-encode the wavelet trees; it therefore needs no lists, only three small fixed-size buffers. Compression performance is evaluated with a C implementation on a PC, while timing measurements are conducted on a typical wireless sensor node using a modified version of the PC code.
Citations: 14
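The two-line memory discipline can be sketched with a single 2x2 Haar analysis step: only two input rows are ever resident in RAM, and each pair of rows yields one output line of each subband. This is an illustrative averaging Haar variant of my own, not the Wi2l coder's exact filter or tree coding:

```python
def haar_two_lines(row0, row1):
    """One 2x2 Haar analysis step on a pair of image rows, sketching the
    two-line idea: only two input rows are held in RAM while one output
    line of each subband (LL, HL, LH, HH) is produced.
    (Illustrative averaging Haar variant, not the paper's exact filter.)"""
    ll, hl, lh, hh = [], [], [], []
    for i in range(0, len(row0), 2):
        a, b = row0[i], row0[i + 1]
        c, d = row1[i], row1[i + 1]
        ll.append((a + b + c + d) / 4)  # low-pass in both directions
        hl.append((a - b + c - d) / 4)  # horizontal detail
        lh.append((a + b - c - d) / 4)  # vertical detail
        hh.append((a - b - c + d) / 4)  # diagonal detail
    return ll, hl, lh, hh
```

Streaming rows in pairs like this keeps the working set at two image lines regardless of image height, which is the property that lets the full coder fit in roughly 1.5 kB of RAM.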