{"title":"Video Compressed Sensing with Multihypothesis","authors":"Eric W. Tramel, J. Fowler","doi":"10.1109/DCC.2011.26","DOIUrl":"https://doi.org/10.1109/DCC.2011.26","url":null,"abstract":"The compressed-sensing recovery of video sequences driven by multihypothesis predictions is considered. Specifically, multihypothesis predictions of the current frame are used to generate a residual in the domain of the compressed-sensing random projections. This residual being typically more compressible than the original frame leads to improved reconstruction quality. To appropriately weight the hypothesis predictions, a Tikhonov regularization to an ill-posed least-squares optimization is proposed. This method is shown to outperform both recovery of the frame independently of the others as well as recovery based on single-hypothesis prediction.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129875904","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lifting Transforms on Graphs for Video Coding","authors":"Eduardo Martínez-Enríquez, Antonio Ortega","doi":"10.1109/DCC.2011.15","DOIUrl":"https://doi.org/10.1109/DCC.2011.15","url":null,"abstract":"We present a new graph-based transform for video signals using wavelet lifting. Graphs are created to capture spatial and temporal correlations in video sequences. Our new transforms allow spatial and temporal correlation to be jointly exploited, in contrast to existing techniques, such as motion compensated temporal filtering, which can be seen as \"separable\" transforms, since spatial and temporal filtering are performed separately. We design efficient ways to form the graphs and to design the prediction and update filters for different levels of the lifting transform as a function of expected degree of correlation between pixels. Our initial results are promising, with improvements in performance as compared to existing methods in terms of PSNR as a function of the percentage of retained coefficients of the transform.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"16 8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126130190","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Explicit Network-Adaptive Robust Multiple Description Coding","authors":"Meng Yang, Xuguang Lan, Nanning Zheng","doi":"10.1109/DCC.2011.88","DOIUrl":"https://doi.org/10.1109/DCC.2011.88","url":null,"abstract":"The data delivery performance of multiple description coding (MDC) over unreliable network with capacity constraints is always related with three factors: redundancy rate, packet loss rate (PLR), bit error rate (BER). It is supposed that the packet losses have been settled by MDC with certain redundancy. We simplified the network-adaptive data delivery problem to only relating with the redundancy rate factor, because the bit-error resilience of the system can be self-adaptively guaranteed well enough for any redundancy case by the proposed codeword ordering method. Then the problem becomes explicit, and is easily and precisely solved by the proposed iterative redundancy control method. The proposed scheme is an extension of scalar quantization (SQ) based MDC, either balanced or unbalanced case. The related index assignment (IA) problem is well solved, and the R-D bound is proved trending to be optimal. This method can be incorporated into any SQ-based MDC system to simplify the network-adaptive delivery problem, fully considering the related factors.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131321973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Dispersion of Lossy Source Coding","authors":"A. Ingber, Y. Kochman","doi":"10.1109/DCC.2011.13","DOIUrl":"https://doi.org/10.1109/DCC.2011.13","url":null,"abstract":"In this work we investigate the behavior of the minimal rate needed in order to guarantee a given probability that the distortion exceeds a prescribed threshold, at some fixed finite quantization block length. We show that the excess coding rate above the rate-distortion function is inversely proportional (to the first order) to the square root of the block length. We give an explicit expression for the proportion constant, which is given by the inverse Q-function of the allowed excess distortion probability, times the square root of a constant, termed the excess distortion dispersion. This result is the dual of a corresponding channel coding result, where the dispersion above is the dual of the channel dispersion. The work treats discrete memoryless sources, as well as the quadratic-Gaussian case.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133489315","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving PPM Algorithm Using Dictionaries","authors":"Yichuan Hu, Jianzhong Zhang, Farooq Khan, Ying Li","doi":"10.1109/DCC.2011.63","DOIUrl":"https://doi.org/10.1109/DCC.2011.63","url":null,"abstract":"We propose a method to improve traditional character-based PPM text compression algorithm for natural languages. Consider a text file as a sequence of alternating words and non-words, the basic idea of our algorithm is to encode non words and prefixes of words using character-based context models and encode suffixes of words using dictionary models. By using dictionary models, the algorithm can encode multiple characters as a whole, and thus enhance the compression efficiency. The advantages of the proposed algorithm are: 1) it does not require any text preprocessing; 2) it does not need any explicit codeword to identify switch between context and dictionary models; 3) it can be applied to any character-based PPM algorithms without incurring much additional computational cost. Details about the algorithm are described below.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125268871","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Matching Dyadic Distributions to Channels","authors":"Georg Böcherer, R. Mathar","doi":"10.1109/DCC.2011.10","DOIUrl":"https://doi.org/10.1109/DCC.2011.10","url":null,"abstract":"Many communication channels with discrete input have non-uniform capacity achieving probability mass functions (PMF). By parsing a stream of independent and equiprobable bits according to a full prefix-free code, a modulator can generate dyadic PMFs at the channel input. In this work, we show that for discrete memoryless channels and for memoryless discrete noiseless channels, searching for good dyadic input PMFs is equivalent to minimizing the Kullback-Leibler distance between a dyadic PMF and a weighted version of the capacity achieving PMF. We define a new algorithm called Geometric Huffman Coding (GHC) and prove that GHC finds the optimal dyadic PMF in O(m log m) steps where m is the number of input symbols of the considered channel. Furthermore, we prove that by generating dyadic PMFs of blocks of consecutive input symbols, GHC achieves capacity when the block length goes to infinity.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"87 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116979093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tree Structure Compression with RePair","authors":"Markus Lohrey, S. Maneth, Roy Mennicke","doi":"10.1109/DCC.2011.42","DOIUrl":"https://doi.org/10.1109/DCC.2011.42","url":null,"abstract":"Larsson and Moffat's RePair algorithm is generalized from strings to trees. The new algorithm (TreeRePair) produces straight-line linear context-free tree (SLT) grammars which are smaller than those produced by previous grammar-based compressors such as BPLEX. Experiments show that a Huffman-based coding of the resulting grammars gives compression ratios comparable to the best known XML file compressors. Moreover, SLT grammars can be used as efficient memory representation of trees. Our investigations show that tree traversals over TreeRePair grammars are 14 times slower than over pointer structures and 5 times slower than over succinct trees, while memory consumption is only 1/43 and 1/6, respectively.","PeriodicalId":328510,"journal":{"name":"2011 Data Compression Conference","volume":"46 45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131055001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}