Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225): Latest Articles

Mail servers with embedded data compression mechanisms
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672308
A. Nand, T. Yu
Abstract: Summary form only given. Typically, e-mail messages are moved across the Internet using the Simple Mail Transfer Protocol (SMTP), which relies on the connection-oriented Transmission Control Protocol (TCP) to establish connections between two mail servers. The Post Office Protocol (POP3) is used to retrieve mail for individual users from a server. We designed and implemented e-mail servers that contain embedded data compression mechanisms; the SMTP protocol is extended to let the mail client and server negotiate compression transparently to users, and the new servers remain backward-compatible with traditional mail servers. The LZSS algorithm carries out the data compression. Different kinds of mail data were used to test the system: textual, binary, and graphical data were transported across the Internet between several Internet-connected Windows NT hosts identified for the experiment.
Citations: 6
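The abstract names LZSS as the compression algorithm embedded in the servers. The following is a minimal sketch of LZSS-style tokenization (a toy brute-force window search; the authors' implementation is not available, and a real coder would pack flags and back-references into a bit stream rather than a token list):

```python
# Toy LZSS sketch: emit literals or (offset, length) back-references
# into a sliding window. Parameters below are illustrative assumptions.
WINDOW = 4096   # sliding-window size
MIN_MATCH = 3   # shortest back-reference worth emitting
MAX_MATCH = 18  # longest back-reference

def lzss_compress(data: bytes):
    """Return tokens: ('lit', byte) or ('ref', offset, length)."""
    tokens, i = [], 0
    while i < len(data):
        best_len, best_off = 0, 0
        start = max(0, i - WINDOW)
        # Brute-force longest-match search (real coders use hash chains
        # or suffix structures; brute force keeps the sketch short).
        for j in range(start, i):
            length = 0
            while (length < MAX_MATCH and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len >= MIN_MATCH:
            tokens.append(('ref', best_off, best_len))
            i += best_len
        else:
            tokens.append(('lit', data[i]))
            i += 1
    return tokens

def lzss_decompress(tokens) -> bytes:
    out = bytearray()
    for t in tokens:
        if t[0] == 'lit':
            out.append(t[1])
        else:
            _, off, length = t
            for _ in range(length):
                out.append(out[-off])  # copies handle overlapping matches
    return bytes(out)
```

In the paper's setting such a codec would sit behind the negotiated SMTP extension, compressing the message body before transmission and decompressing on receipt.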
Intensity controlled motion compensation
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672153
J. Kari, Mihai Gavrilescu
Abstract: A new motion compensation technique that allows more than one motion vector inside each block is introduced. The technique uses intensity information to determine which motion vector to apply at any given pixel. An efficient motion estimation algorithm is described that finds near-optimal selections of motion vectors. The simulation results show a significant improvement in prediction accuracy over the traditional one-motion-vector-per-block model.
Citations: 1
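The abstract does not spell out how intensity selects the vector; one plausible reading is that each block carries two candidate vectors and a threshold on the reference-frame intensity chooses between them per pixel. The sketch below illustrates that reading only; the vector pair, threshold rule, and block geometry are all assumptions, not the paper's algorithm:

```python
# Hypothetical per-pixel vector selection: pixels whose reference
# intensity is below `threshold` use v_dark, the rest use v_bright.
import numpy as np

def predict_block(ref, y0, x0, size, v_dark, v_bright, threshold):
    """Predict a size x size block at (y0, x0) from reference frame `ref`."""
    pred = np.empty((size, size), dtype=ref.dtype)
    for dy in range(size):
        for dx in range(size):
            y, x = y0 + dy, x0 + dx
            vy, vx = v_dark if ref[y, x] < threshold else v_bright
            pred[dy, dx] = ref[y + vy, x + vx]
    return pred
```

With both candidate vectors equal, this degenerates to ordinary one-vector block compensation, which is the baseline the paper improves on.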
Compression via guided parsing
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672269
William S. Evans
Abstract: Summary form only given. The reduction in storage size achieved by compressing a file translates directly into a reduction in transmission time when communicating the file. An increasingly common form of transmitted data is a computer program description. This paper examines the compression of source code, the high-level language representation of a program, using the language's context-free grammar. We call the general technique guided parsing, since it is a compression scheme based on predicting the behavior of a parser as it parses the source code and guiding that behavior by encoding each next action against the prediction. We describe the implementation and results of two very different forms of guided parsing: one based on bottom-up parsing, the other a top-down approach.
Citations: 9
On suboptimal multidimensional companding
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672191
S. Simon
Abstract: A vector quantizer (VQ) consisting of a nonlinear mapping (compressor), a lattice VQ, and the inverse of the compressor (expander) is considered. It was previously pointed out that in dimensions k>2, apart from linear transformations and translations, only reflections through reciprocal radii can preserve optimality in terms of the lattice cells' normalized second moments. We therefore consider the suboptimal case and provide a method to determine the loss introduced by companding. Using a spherically symmetric compander as an example, it is demonstrated that the loss can be kept very small in practical situations, especially when large VQ dimensions are chosen.
Citations: 11
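The compressor/lattice/expander pipeline the abstract describes can be sketched in a few lines. The radial map r → r^α below is an illustrative stand-in for a spherically symmetric compander (the paper's actual compander and lattice are not reproduced here), and the cubic lattice plays the role of the lattice VQ:

```python
# Companding VQ sketch: nonlinearly compress, quantize on a cubic
# lattice, then apply the inverse nonlinearity. alpha and step are
# illustrative parameters, not values from the paper.
import numpy as np

def compress(x, alpha=0.5):
    """Spherically symmetric compressor: scale radius r to r**alpha."""
    r = np.linalg.norm(x)
    return x if r == 0 else x * (r ** alpha / r)

def expand(y, alpha=0.5):
    """Inverse of `compress`: scale radius r back to r**(1/alpha)."""
    r = np.linalg.norm(y)
    return y if r == 0 else y * (r ** (1.0 / alpha) / r)

def companded_vq(x, step=0.1, alpha=0.5):
    y = compress(x, alpha)
    y_hat = np.round(y / step) * step   # cubic-lattice quantizer
    return expand(y_hat, alpha)
```

The paper's contribution is a method for quantifying the distortion penalty such a suboptimal compander incurs relative to an optimal quantizer; the sketch only shows where that penalty arises.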
Non-uniform PPM and context tree models
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672156
J. Åberg, Y. Shtarkov, B. Smeets
Abstract: We consider the problem of optimizing PPM by choosing different estimators, and different estimator parameters, for different subsets of nodes in the context tree. Methods for such optimization are presented for Markov chain and context tree models, both for individual files and over given sets of files, and it is demonstrated that the extension from Markov chain models to context tree models is necessary to obtain significant improvements in compression ratio.
Citations: 2
Compression by model combination
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672160
Tong Zhang
Abstract: In the probabilistic framework for data compression, a model of the probability distribution of a data source is constructed, and the predicted probability is entropy-coded. To achieve better compression, most traditional methods resort to higher-order models. However, this approach is limited by memory and often suffers from the context dilution problem. In this paper, we present methods that combine a few low-order models to achieve compression equivalent to or better than that of a high-order model. We show that when applying our techniques to bi-level images, we achieve state-of-the-art compression within the probabilistic framework.
Citations: 8
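To make the idea of combining low-order models concrete, the sketch below linearly mixes an order-0 and an order-1 binary predictor and measures the ideal code length of the mixture. This is a generic illustration of model mixing with fixed weights, not the paper's combination method:

```python
# Mix order-0 and order-1 adaptive binary models with fixed weight w,
# and accumulate the ideal (arithmetic-coding) code length in bits.
import math
from collections import defaultdict

def mixed_code_length(bits, w=0.5):
    c0 = [1, 1]                           # order-0 Laplace counts [zeros, ones]
    c1 = defaultdict(lambda: [1, 1])      # order-1 counts keyed by previous bit
    prev, total = 0, 0.0
    for b in bits:
        p0 = c0[1] / (c0[0] + c0[1])              # order-0 P(bit = 1)
        p1 = c1[prev][1] / sum(c1[prev])          # order-1 P(bit = 1 | prev)
        p = w * p0 + (1 - w) * p1                 # linear mixture
        total += -math.log2(p if b else 1 - p)    # ideal code length
        c0[b] += 1
        c1[prev][b] += 1
        prev = b
    return total
```

On data with first-order structure (e.g. alternating bits), the mixture codes well below one bit per symbol while the order-0 model alone cannot, which is the gain model combination is after.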
Conditional source coding with competitive lists
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672313
J. Sayir
Abstract: Summary form only given. A new lossless source coding algorithm was developed that achieves a compression ratio slightly better than the Lempel-Ziv-Welch algorithm but requires as little as 250 kBytes of storage. The algorithm is based on the context-tree approach, encoding one input symbol at a time, so its throughput lies in a range comparable to the PPM algorithm. The very low memory requirement is achieved by eliminating the costly probability estimation commonly performed at every context in context-tree algorithms. Instead, the algorithm uses a competitive list at every context. A competitive list is an invertible device that converts the output stream of an unknown discrete memoryless source into a stream of integers whose first-order probability distribution is monotone. The output of all the lists is encoded using a single arithmetic encoder.
Citations: 0
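A move-to-front list is the classic example of the kind of invertible symbol-to-rank device the abstract describes: recently seen symbols get small ranks, skewing the output distribution toward small integers. The sketch below shows a single such list over a byte alphabet (the paper maintains one list per context and feeds all ranks to one arithmetic coder; the update rule of its competitive lists may differ from plain move-to-front):

```python
# Move-to-front transform: an invertible map from bytes to ranks
# whose output distribution is skewed toward small values.

def mtf_encode(data: bytes):
    table = list(range(256))
    ranks = []
    for b in data:
        r = table.index(b)      # current rank of the symbol
        ranks.append(r)
        table.pop(r)
        table.insert(0, b)      # promote to the front of the list
    return ranks

def mtf_decode(ranks):
    table = list(range(256))
    out = bytearray()
    for r in ranks:
        b = table.pop(r)        # rank -> symbol, same list discipline
        out.append(b)
        table.insert(0, b)
    return bytes(out)
```

Because encoder and decoder update their lists identically, no probabilities need to be stored per context, which is where the memory saving over estimator-based context-tree coders comes from.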
Adjustments for JPEG de-quantization coefficients
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672297
G. Lakhani
Abstract: Summary form only given. In the JPEG baseline compression algorithm, the quantization loss to DCT coefficients can be reduced by exploiting the observation that the distributions of the DCT coefficients peak at zero and decrease exponentially. This means that the mid-point m of a quantization interval, used by the JPEG decoder to restore all coefficients falling within the interval, may be replaced by another point y, closer to zero but still within the interval. We model the distributions by λe^(−λ|x|), where λ>0 is a constant derivable from statistical parameters such as the mean or variance, and choose the adjustment q=|m−y| so that the sum of the loss over all coefficients falling within a quantization interval is zero for each interval; this yields q = Q(e^(λ(Q−1)) + (Q−2)/2)/(2e^(λ(Q−1)) − 1) − 1/λ, where Q is the quantizer step size. To test the usefulness of this idea, we implemented two approaches: (1) the JPEG encoder computes λ for each DCT distribution and passes it as part of the coded data to the decoder, and (2) the JPEG decoder computes λ incrementally from the quantized DCT coefficients as it decodes its input. Through experiments, we found that neither approach resulted in much improvement, but we found a better approach (OUR) that does not require any modeling of the DCT. It computes the adjustments as Σ(|m−y|·C)/ΣC, where C is the number of coefficients falling within an interval and the sum is taken over all intervals not containing the zero DCT coefficient. We also implemented the formulation developed by Ahumada et al. (see SID Digest, 1994) for comparison with the OUR approach. The comparison is shown in terms of the percentage reduction in the RMSE of the images.
Citations: 8
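The core observation is numeric: for an exponentially decaying coefficient density, the conditional mean of a quantization interval sits closer to zero than its mid-point, so reconstructing at the centroid reduces expected error. A small sketch with an assumed step size and decay rate (illustrative values, not the paper's):

```python
# Centroid reconstruction for a Laplacian-tailed DCT coefficient:
# compare the interval mid-point m with the interval's conditional mean.
import math

def laplacian_centroid(a: float, b: float, lam: float) -> float:
    """Conditional mean of the density lam*exp(-lam*x), x > 0, on [a, b]."""
    ea, eb = math.exp(-lam * a), math.exp(-lam * b)
    return ((a + 1 / lam) * ea - (b + 1 / lam) * eb) / (ea - eb)

Q, lam = 16.0, 0.1           # quantizer step and decay rate (assumed)
m = 1.5 * Q                  # mid-point of the interval [Q, 2Q]
y = laplacian_centroid(Q, 2 * Q, lam)
# y lies inside the interval but below m, i.e. pulled toward zero,
# which is exactly the adjustment q = |m - y| the paper analyzes.
```

The closed form follows from integrating x·λe^(−λx) over the interval; as the interval widens to the whole support, the centroid tends to the distribution mean 1/λ.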
Robust wavelet zerotree image compression with fixed-length packetization
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672184
J. K. Rogers, P. Cosman
Abstract: We present a novel robust image compression algorithm in which the output of a wavelet zerotree-style coder is manipulated into fixed-length segments. The segments are independently decodable, and errors occurring in one segment do not propagate into any other. The method provides both excellent compression performance and graceful degradation under increasing packet losses. We extend the basic scheme to perform region-based compression, in which specified portions of the image are coded to higher quality with little or no side information required by the decoder.
Citations: 77
Bayesian state combining for context models
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225) Pub Date: 1998-03-30 DOI: 10.1109/DCC.1998.672161
S. Bunton
Abstract: The best-performing on-line methods for estimating the probabilities of symbols in a sequence (required for computing minimal codes) use context trees with either information-theoretic state selection or context-tree weighting. This paper derives de novo from Bayes' theorem a novel technique for modeling sequences on-line with context trees, which we call "Bayesian state combining" (BSC). BSC is comparable in function to both information-theoretic state selection and context-tree weighting, yet it is a truly distinct alternative to these techniques, which, like BSC, can be viewed as "dispatchers" of probability estimates from the set of competing memoryless models represented by the context tree. The technique handles sequences over m-ary input alphabets for arbitrary m and may employ any probability estimator applicable to context models (e.g., Laplace, Krichevsky-Trofimov, blending, and, more generally, mixtures). In experiments that control other (256-ary) context-tree model features, such as Markov order and probability estimators, we compare the performance of BSC and information-theoretic state selection. The background notation and concepts are reviewed as required to understand the modeling problem and the application of our result. We derive the leading notion of the paper, which dynamically maps certain states in context models to a set of mutually exclusive hypotheses with prior and posterior probabilities, and describe the efficient sequential computation of the posterior probabilities, made possible by a non-obvious application of the percolating description-length update mechanism introduced by Bunton (see Proceedings Data Compression Conference, IEEE Computer Society Press, 1997). The preliminary empirical performance of the technique on the Calgary Corpus is presented, and the relationship of BSC to information-theoretic state selection and context-tree weighting is discussed.
Citations: 1