{"title":"Weighting of double exponential distributed data in lossless image compression","authors":"N. Ekstrand, B. Smeets","doi":"10.1109/DCC.1998.672268","DOIUrl":null,"url":null,"abstract":"Summary form only given. State-of-the-art lossless image compression schemes use a prediction scheme, a context model and an arithmetic encoder. The discrepancy between the predicted value and the actual value is regarded to be double exponentially distributed. The BT/CARPscheme was considered in Weinberger et al. (1996) as a means to find limits in lossless image compression. The scheme uses the context-algorithm (Rissanen 1983) which is, in terms of redundancy, an asymptotically optimal tree-algorithm. Further, BT/CARP uses extended tree nodes which contain a linear prediction scheme and a model for the double exponentially distributed data (DE-data). The model parameters are estimated and from the corresponding distribution the symbol probability distribution can be calculated. The drawback of the parameter estimating technique is its poor performance for short sequences. In order to improve the BT/CARP-scheme we have exchanged the estimation techniques with probability assignment techniques: the CTW-algorithm (Williams et al. 1995) and our weighting method for DE-data. We conclude that the suggested probability assignment technique has a favorable effect on the compression performance when compared with the traditional estimation techniques. On a test-image set the assumed improvement was verified.","PeriodicalId":191890,"journal":{"name":"Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1998-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCC.1998.672268","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Summary form only given. State-of-the-art lossless image compression schemes use a prediction scheme, a context model and an arithmetic encoder. The discrepancy between the predicted value and the actual value is regarded as double exponentially distributed. The BT/CARP scheme was considered by Weinberger et al. (1996) as a means of establishing limits in lossless image compression. The scheme uses the Context algorithm (Rissanen 1983), which is, in terms of redundancy, an asymptotically optimal tree algorithm. Further, BT/CARP uses extended tree nodes that contain a linear prediction scheme and a model for the double exponentially distributed data (DE-data). The model parameters are estimated, and from the corresponding distribution the symbol probability distribution can be calculated. The drawback of this parameter estimation technique is its poor performance on short sequences. To improve the BT/CARP scheme, we have replaced the estimation techniques with probability assignment techniques: the CTW algorithm (Willems et al. 1995) and our weighting method for DE-data. We conclude that the suggested probability assignment technique has a favorable effect on compression performance compared with the traditional estimation techniques. This expected improvement was verified on a set of test images.
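To illustrate the distinction the abstract draws between parameter estimation and probability assignment, the sketch below contrasts a plug-in (estimated) two-sided geometric model for prediction residuals with a Bayesian mixture ("weighting") over candidate scale parameters. This is a minimal, hypothetical example under assumed choices (a discretized two-sided geometric stand-in for the DE distribution, a fixed grid of candidate parameters); it is not the paper's actual CTW-based or DE-weighting construction, and all function and variable names are illustrative.

```python
import numpy as np

# Prediction-error alphabet for 8-bit images (assumed range).
SUPPORT = np.arange(-255, 256)

def tsg_pmf(symbols, theta):
    """Two-sided geometric pmf P(e) proportional to theta^|e| over SUPPORT,
    a common discrete stand-in for double exponentially distributed residuals."""
    w = theta ** np.abs(SUPPORT)
    w /= w.sum()
    return w[np.searchsorted(SUPPORT, symbols)]

def plugin_codelength(errors, theta_grid=np.linspace(0.5, 0.99, 50)):
    """Estimation (plug-in) approach: fit a single theta by maximum likelihood,
    then code every symbol with that one distribution."""
    loglik = [np.log(tsg_pmf(errors, t)).sum() for t in theta_grid]
    best = theta_grid[int(np.argmax(loglik))]
    return -np.log2(tsg_pmf(errors, best)).sum()

def weighted_codelength(errors, theta_grid=np.linspace(0.5, 0.99, 50)):
    """Probability assignment (weighting) approach: code each symbol with a
    Bayesian mixture over candidate theta values, updating the weights
    sequentially; this avoids committing to a poorly estimated parameter
    early in a short sequence."""
    weights = np.full(len(theta_grid), 1.0 / len(theta_grid))
    bits = 0.0
    for e in errors:
        p_each = np.array([tsg_pmf(np.array([e]), t)[0] for t in theta_grid])
        p_mix = float(weights @ p_each)      # mixture probability assigned to e
        bits += -np.log2(p_mix)
        weights = weights * p_each / p_mix   # Bayes update of the mixture weights
    return bits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A short Laplacian-like residual sequence, where plug-in estimates are weakest.
    errs = np.clip(np.rint(rng.laplace(0.0, 4.0, size=30)).astype(int), -255, 255)
    print("plug-in codelength (bits):  ", plugin_codelength(errs))
    print("weighting codelength (bits):", weighted_codelength(errs))
```

On short sequences the mixture typically assigns probabilities more robustly than the plug-in fit, which mirrors the motivation stated in the abstract for replacing estimation with probability assignment.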