2010 Data Compression Conference — Latest Publications

Lossless Compression of Mapped Domain Linear Prediction Residual for ITU-T Recommendation G.711.0
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.69
N. Harada, Y. Kamamoto, T. Moriya
Abstract: ITU-T Rec. G.711 is widely used for narrowband speech communication. ITU-T has recently established a very-low-complexity, efficient lossless coding standard for G.711, called G.711.0 (Lossless compression of G.711 pulse code modulation). This paper introduces coding technologies newly proposed and applied in the G.711.0 codec, such as plus-minus zero mapping for mapped-domain linear predictive coding, and escaped-Huffman coding combined with adaptive recursive Rice coding for lossless compression of the prediction residual. Performance test results for these coding tools are compared with those for the conventional technology. Performance is measured by a figure of merit (FoM), a function of the trade-off between compression performance and computational complexity. The proposed tools improve compression performance by 0.16% in total while keeping the computational complexity of the encoder/decoder pair low (about 1.0 WMOPS on average and 1.667 WMOPS in the worst case).
Citations: 9
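As a rough illustration of the residual-coding idea in the abstract above, the sketch below sign-maps residuals to non-negative integers (in the spirit of plus-minus mapping) and Rice-codes them with a simple adaptive parameter. The adaptation rule and bit layout are assumptions for illustration only, not the actual G.711.0 escaped-Huffman/recursive-Rice design.

```python
def zigzag(r):
    # Map a signed residual to a non-negative integer: 0,-1,1,-2,2 -> 0,1,2,3,4
    return 2 * r if r >= 0 else -2 * r - 1

def rice_encode(u, k):
    # Rice code with parameter k: unary-coded quotient, then k remainder bits.
    q, rem = u >> k, u & ((1 << k) - 1)
    return "1" * q + "0" + (format(rem, "0{}b".format(k)) if k else "")

def encode_residuals(residuals, k=2):
    # Toy adaptation rule: grow k after a long quotient, shrink it after
    # a small value.  G.711.0's actual recursive adaptation differs.
    bits = []
    for r in residuals:
        u = zigzag(r)
        bits.append(rice_encode(u, k))
        if (u >> k) > 1:
            k += 1
        elif k > 0 and u < (1 << (k - 1)):
            k -= 1
    return "".join(bits)
```

Adapting k per sample keeps the code matched to the local residual magnitude without sending the parameter explicitly, since the decoder can mirror the same rule.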
A SAT-Based Scheme to Determine Optimal Fix-Free Codes
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.22
Navid Abedini, S. Khatri, S. Savari
Abstract: Fix-free (or reversible-variable-length) codes are prefix-condition codes that can also be decoded in the reverse direction. They have attracted attention from several communities and are used in video standards. Two variations of fix-free codes (with additional constraints) have also been considered for joint source-channel coding: 1) "symmetric" fix-free codes, which require the codewords to be palindromes; 2) fix-free codes with distance constraints on pairs of codewords. We propose a new approach to determine the existence of a fix-free code with a given set of codeword lengths for each of the three variations of the problem. We also describe a branch-and-bound algorithm to find the collection of optimal codes for asymmetric and symmetric fix-free codes.
Citations: 15
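To make the existence question above concrete, here is a brute-force sketch: a code is fix-free when no codeword is a prefix or a suffix of another, and the question is whether a fix-free code exists for a given multiset of codeword lengths. The exhaustive backtracking search here is a stand-in for the paper's SAT encoding and only scales to tiny instances.

```python
from itertools import product

def is_fix_free(codewords):
    # Fix-free: no codeword is a prefix or a suffix of another codeword.
    for a, b in product(codewords, repeat=2):
        if a != b and (b.startswith(a) or b.endswith(a)):
            return False
    return True

def exists_fix_free(lengths, alphabet="01"):
    # Does a binary fix-free code with these codeword lengths exist?
    # Exhaustive backtracking; the paper encodes this as a SAT instance.
    def backtrack(remaining, chosen):
        if not remaining:
            return True
        for tup in product(alphabet, repeat=remaining[0]):
            w = "".join(tup)
            if w not in chosen and is_fix_free(chosen + [w]):
                if backtrack(remaining[1:], chosen + [w]):
                    return True
        return False
    return backtrack(sorted(lengths), [])
```

For example, lengths {1, 2, 3} admit the fix-free code {0, 11, 101}, while three distinct binary codewords of length 1 cannot exist.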
Low-Complexity PARCOR Coefficient Quantizer and Prediction Order Estimator for G.711.0 (Lossless Speech Coding)
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.49
Y. Kamamoto, T. Moriya, N. Harada
Abstract: This paper presents two low-complexity tools used in the new ITU-T Recommendation G.711.0, the standard for lossless compression of G.711 (A-law/Mu-law logarithmic PCM) speech data. One is an algorithm for quantizing the PARCOR/reflection coefficients; the other is an estimation method for the optimal prediction order. Both tools are based on a criterion that minimizes the entropy of the prediction residual signals and can be implemented as fixed-point, low-complexity algorithms. Because it can losslessly reduce the data rate of G.711, the prevailing speech-coding technology, G.711.0 with these practical tools is expected to be widely deployed.
Citations: 4
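For background on the two tools above: PARCOR (reflection) coefficients fall out of the Levinson-Durbin recursion, which also yields the prediction-error energy at every order, so order selection can reuse the same pass. This floating-point sketch uses error energy plus a small per-coefficient penalty as a crude stand-in for the paper's residual-entropy criterion and fixed-point arithmetic.

```python
def levinson_durbin(r, order):
    # r: autocorrelation values r[0..order].  Returns the PARCOR
    # (reflection) coefficients and the error energy after each order.
    a = [0.0] * (order + 1)
    err = r[0]
    parcor, errors = [], []
    for m in range(1, order + 1):
        acc = r[m] - sum(a[i] * r[m - i] for i in range(1, m))
        k = acc / err
        parcor.append(k)
        new_a = a[:]
        new_a[m] = k
        for i in range(1, m):
            new_a[i] = a[i] - k * a[m - i]
        a = new_a
        err *= (1.0 - k * k)
        errors.append(err)
    return parcor, errors

def pick_order(r, max_order, lam=0.01):
    # Choose the order minimizing error energy plus a per-coefficient
    # penalty (a hypothetical proxy for the residual-entropy criterion).
    _, errors = levinson_durbin(r, max_order)
    costs = [e + lam * (m + 1) for m, e in enumerate(errors)]
    return 1 + min(range(len(costs)), key=costs.__getitem__)
```

For an AR(1)-like autocorrelation, only the first reflection coefficient is significant, so the estimator settles on order 1.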
LDPC Codes for Information Embedding and Lossy Distributed Source Coding
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.87
Mina Sartipi
Abstract: Inspired by our recent work on lossy distributed source coding with side information available at the decoder, we propose a practical scheme for an information embedding system with side information available at the encoder. The proposed scheme is based on sending parity bits using LDPC codes. We provide a design procedure for the LDPC code that guarantees performance close to the Gelfand-Pinsker and Wyner-Ziv limits. Simulation results show that the proposed method performs close to both theoretical limits even for short code lengths.
Citations: 2
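The decoder-side-information setting above is often realized by syndrome binning: the encoder sends only the syndrome of the source word under a parity-check matrix, and the decoder picks the coset member closest to its side information. The toy sketch below uses exhaustive search in place of LDPC belief propagation, so it only works for tiny codes; the matrix here is illustrative, not from the paper.

```python
from itertools import product

def syndrome(H, x):
    # s = H.x over GF(2).
    return [sum(h * xi for h, xi in zip(row, x)) % 2 for row in H]

def decode_with_side_info(H, s, y):
    # Among all sequences whose syndrome equals s, return the one
    # closest in Hamming distance to the side information y.
    # Exhaustive search stands in for LDPC message passing.
    n = len(y)
    best, best_d = None, n + 1
    for cand in product([0, 1], repeat=n):
        if syndrome(H, cand) == s:
            d = sum(c != b for c, b in zip(cand, y))
            if d < best_d:
                best, best_d = list(cand), d
    return best
```

Sending the 2-bit syndrome instead of the 3-bit source word is where the compression comes from; the side information resolves the remaining ambiguity within the coset.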
TreeZip: A New Algorithm for Compressing Large Collections of Evolutionary Trees
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.64
Suzanne J. Matthews, Seung-Jin Sul, T. Williams
Abstract: Evolutionary trees are family trees that represent the relationships among a group of organisms. Phylogenetic analyses often produce thousands of hypothetical trees that can represent the true phylogeny, and these large collections of trees are costly to store. We introduce TreeZip, a novel algorithm designed to losslessly compress phylogenetic trees. The advantage of TreeZip is its ability to store the information shared among trees uniquely and to compress the relationships effectively. We evaluate our approach on fourteen tree collections ranging from 2,505 to 150,000 trees (0.6 MB to 434 MB of storage). Our results demonstrate that TreeZip compresses phylogenetic trees effectively, typically to 2% or less of the original size; coupled with 7zip, it reaches less than 1%. On our largest dataset, TreeZip+7zip compressed the input file to 0.008% of its original size. These results strongly suggest that TreeZip is an ideal approach for compressing phylogenetic trees.
Citations: 3
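The core observation behind storing shared information once can be sketched simply: represent each tree by its bipartitions (the taxon split induced by each internal edge), keep a global table of unique bipartitions, and encode each tree as indices into that table. This is only the deduplication idea, not TreeZip's actual file format or bitstring encoding.

```python
def compress_trees(trees):
    # Each tree is given as a set of bipartitions (frozensets of the
    # taxa on one side of an internal edge).  Shared bipartitions are
    # stored once; each tree becomes a sorted list of table indices.
    table, index = [], {}
    encoded = []
    for tree in trees:
        ids = []
        for bip in tree:
            if bip not in index:
                index[bip] = len(table)
                table.append(bip)
            ids.append(index[bip])
        encoded.append(sorted(ids))
    return table, encoded
```

Collections of candidate phylogenies are highly redundant — most trees share most bipartitions — which is why indexing into a shared table shrinks them so dramatically.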
Optimum String Match Choices in LZSS
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.67
G. Little, J. Diamond
Abstract: The LZ77 and LZ78 compression algorithms make a greedy choice when looking for the next string of input symbols to match: the longest string found in the current dictionary is chosen as the next match. Many variations of LZ77 and LZ78 have been proposed; some attempt to improve compression by sometimes choosing a non-maximal string, if such a choice might improve the overall compression ratio. These approaches make the decision based on local criteria, in an attempt to minimize the number of strings matched. In this paper we present an algorithm which computes a set of matches designed to minimize the number of bits output, not necessarily the number of strings matched.
Citations: 1
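Minimizing output bits rather than match count is naturally a shortest-path problem over positions in the text: each literal or match is an edge weighted by its encoded size, and dynamic programming finds the cheapest parse. The sketch below assumes a flat cost model (9 bits per literal, 17 bits per match token) purely for illustration; the paper's cost model follows its actual LZSS token encoding.

```python
def optimal_parse(text, window=255, min_match=2,
                  literal_bits=9, match_bits=17):
    # cost[i] = minimum bits needed to encode text[:i].
    n = len(text)
    INF = float("inf")
    cost = [0] + [INF] * n
    choice = [None] * (n + 1)
    for i in range(n):
        if cost[i] == INF:
            continue
        # literal edge
        if cost[i] + literal_bits < cost[i + 1]:
            cost[i + 1] = cost[i] + literal_bits
            choice[i + 1] = ("lit", text[i])
        # match edges: scan the window for matches starting at i
        for j in range(max(0, i - window), i):
            l = 0
            while i + l < n and text[j + l] == text[i + l]:
                l += 1   # overlapping matches are allowed, as in LZ77
            for length in range(min_match, l + 1):
                if cost[i] + match_bits < cost[i + length]:
                    cost[i + length] = cost[i] + match_bits
                    choice[i + length] = ("match", i - j, length)
    # backtrack to recover the parse
    ops, i = [], n
    while i > 0:
        op = choice[i]
        ops.append(op)
        i -= 1 if op[0] == "lit" else op[2]
    return cost[n], ops[::-1]
```

Because every match length from min_match up to the maximum becomes its own edge, the parse is free to take a shorter-than-maximal match whenever that leads to a cheaper total, which is exactly what greedy parsing cannot do.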
Enhanced Adaptive Interpolation Filters for Video Coding
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.46
Yan Ye, G. Motta, M. Karczewicz
Abstract: H.264/AVC uses motion-compensated prediction with fractional-pixel precision to reduce temporal redundancy in the input video signal. The Adaptive Interpolation Filter (AIF) framework [3] has been shown to significantly improve the accuracy of motion-compensated prediction. In this paper, we present the Enhanced Adaptive Interpolation Filters (E-AIF) scheme, which extends the AIF framework with features aimed at both improving performance and reducing complexity: a full-pixel position filter with filter offset, a radial-shaped 12-position filter support, and rate-distortion-based filter selection. Simulations show that E-AIF achieves up to 20% bit-rate reduction compared to H.264/AVC. Compared to all other AIF schemes, E-AIF further reduces the bit rate by up to 6% and consistently demonstrates the highest performance.
Citations: 16
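The rate-distortion-based filter selection mentioned above follows the usual Lagrangian form: among candidate filters, pick the one minimizing J = D + lambda*R, trading prediction distortion against the bits needed to signal the filter coefficients. The candidate values below are made up for illustration.

```python
def rd_select(candidates, lam):
    # candidates: (filter_id, distortion, rate_bits) triples measured
    # for one frame.  Return the id minimizing the Lagrangian cost
    # J = D + lam * R.
    return min(candidates, key=lambda c: c[1] + lam * c[2])[0]
```

At small lambda the adaptive filter's distortion gain wins; at large lambda its coefficient-signaling cost pushes the choice back to the fixed filter.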
Packet Dropping for Widely Varying Bit Reduction Rates Using a Network-Based Packet Loss Visibility Model
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.47
Ting-Lan Lin, Jihyun Shin, P. Cosman
Abstract: We propose a packet dropping algorithm for various packet loss rates. A network-based packet loss visibility model is used to evaluate the visual importance of each H.264 packet inside the network. During network congestion, based on the estimated loss visibility of each packet, we drop the least visible frames and/or the least visible packets until the required bit reduction rate is achieved. Measured by a computable perceptually based metric, our algorithm performs better than an existing approach (dropping B packets or frames).
Citations: 12
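The packet-level half of the policy above reduces to a greedy loop: sort packets by estimated loss visibility and drop from the bottom until the bit budget is met. This sketch omits the frame-level grouping and the visibility model itself, whose scores are assumed given.

```python
def drop_packets(packets, bits_to_remove):
    # packets: list of (packet_id, size_bits, visibility_score), where a
    # lower score means the packet's loss is less visible to viewers.
    # Greedily drop the least-visible packets until enough bits are cut.
    dropped, removed = [], 0
    for pid, size, vis in sorted(packets, key=lambda p: p[2]):
        if removed >= bits_to_remove:
            break
        dropped.append(pid)
        removed += size
    return dropped, removed
```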
Lossless Compression of Maps, Charts, and Graphs via Color Separation
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.102
S. alZahir, Arber Borici
Abstract: We present a fast lossless compression scheme for digital map, chart, and graph images in raster format. This work makes two main contributions. The first centers on the creation of a codebook based on symbol entropy. The second is a new row-column reduction coding algorithm. The scheme determines the number of distinct colors in the given image and creates a separate bi-level data layer for each color, in which one level represents the color and the other the background. The bi-level layers are then individually compressed using the proposed method, which combines symbol entropy with our row-column reduction coding algorithm. Experimental results show that the scheme achieves average compression of 0.035 bpp for map images and 0.03 bpp for charts and graphs. These results are better than most reported in the literature. Moreover, the scheme is simple and fast.
Citations: 0
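The color-separation step above is straightforward to sketch: one bi-level layer per distinct color, marking where that color occurs. The reduction function below merely discards all-zero rows and columns and records which were kept — a simplification in the spirit of, but not identical to, the paper's row-column reduction coding.

```python
def color_layers(image):
    # image: 2-D list of color indices.  Produce one bi-level layer per
    # color: 1 where that color occurs, 0 elsewhere (the background).
    colors = sorted({c for row in image for c in row})
    return {c: [[1 if px == c else 0 for px in row] for row in image]
            for c in colors}

def row_column_reduce(layer):
    # Keep only rows/columns containing at least one 1, recording their
    # indices so the layer can be reconstructed losslessly.
    rows = [i for i, r in enumerate(layer) if any(r)]
    cols = [j for j in range(len(layer[0]))
            if any(layer[i][j] for i in rows)]
    core = [[layer[i][j] for j in cols] for i in rows]
    return rows, cols, core
```

Maps and charts use few colors over large uniform regions, so each layer is sparse and the reduced core plus index lists is far smaller than the layer itself.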
Dual Contribution of JPEG 2000 Images for Unidirectional Links
2010 Data Compression Conference, Pub Date: 2010-03-24, DOI: 10.1109/DCC.2010.81
J. M. Barbero, Eugenio Santos, Abraham Gutierrez
Abstract: Broadcast content production generates large video and audio files that must be transmitted among different production centers. In certain circumstances this material is contributed over satellite links, which usually have a relatively high error probability. In addition, the image is degraded by conversion to baseband video, transcoding to different compression systems, transmission errors, and drops in the link. To overcome these limitations, we present a transmission system, based on a patent, that ensures the quality of professional JPEG 2000 images.
Citations: 1