{"title":"Optimal decoding of entropy coded memoryless sources over binary symmetric channels","authors":"K.P. Subbalakshmi, J. Vaisey","doi":"10.1109/DCC.1998.672315","DOIUrl":null,"url":null,"abstract":"Summary form only given. Entropy codes (e.g. Huffman codes) are often used to improve the rate-distortion performance of codecs for most sources. However, transmitting entropy coded sources over noisy channels can cause the encoder and decoder to lose synchronization, because the codes tend to be of variable length. Designing optimal decoders to deal with this problem is nontrivial since it is no longer optimal to process the data in fixed-length blocks, as is done with fixed-length codes. This paper deals with the design of an optimal decoder (MAPD), in the maximum a posteriori (MAP) sense, for an entropy coded memoryless source transmitted over a binary symmetric channel (BSC) with channel cross over probability /spl epsiv/. The MAP problem is cast in a dynamic programming framework and a Viterbi like implementation of the decoder is presented. At each stage the MAPD performs two operations: the metric-update and the merger-check operations. A stream of 40,000 samples of a zero mean, unit variance, Gaussian source, quantized with uniform, N-level quantizers was Huffman encoded and the resulting bit stream was transmitted over a BSC. Experiments were performed for values of N ranging from 128 to 1024 and for four different random error patterns, obtained using a random number generator. The results demonstrate that the MAPD performs better than the HD on an average, whenever /spl epsiv/ is comparable to the source probabilities. A maximum reduction of 2.94% in the bits that are out of synchronization, was achieved for the 1024 level quantizer.","PeriodicalId":191890,"journal":{"name":"Proceedings DCC '98 Data Compression Conference (Cat. 
No.98TB100225)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1998-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCC.1998.672315","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16
Abstract
Summary form only given. Entropy codes (e.g., Huffman codes) are often used to improve the rate-distortion performance of codecs for most sources. However, transmitting entropy coded sources over noisy channels can cause the encoder and decoder to lose synchronization, because the codes tend to be of variable length. Designing optimal decoders to deal with this problem is nontrivial, since it is no longer optimal to process the data in fixed-length blocks, as is done with fixed-length codes. This paper deals with the design of an optimal decoder (MAPD), in the maximum a posteriori (MAP) sense, for an entropy coded memoryless source transmitted over a binary symmetric channel (BSC) with channel crossover probability ε. The MAP problem is cast in a dynamic programming framework, and a Viterbi-like implementation of the decoder is presented. At each stage the MAPD performs two operations: the metric-update and the merger-check operations. A stream of 40,000 samples of a zero-mean, unit-variance Gaussian source, quantized with uniform N-level quantizers, was Huffman encoded, and the resulting bit stream was transmitted over a BSC. Experiments were performed for values of N ranging from 128 to 1024 and for four different random error patterns, obtained using a random number generator. The results demonstrate that the MAPD performs better than the conventional Huffman decoder (HD) on average whenever ε is comparable to the source probabilities. A maximum reduction of 2.94% in the bits that are out of synchronization was achieved for the 1024-level quantizer.
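The dynamic-programming formulation described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the three-symbol source, its Huffman code, and the value of ε are invented for the example, and the merger-check pruning step is omitted (every bit position is simply treated as a trellis state). The decoder maximizes the joint log-metric log P(symbol sequence) + log P(received bits | transmitted bits) over all symbol sequences whose encoding has the same length as the received stream.

```python
import math

# Hypothetical toy setup (not from the paper): a 3-symbol memoryless
# source with a valid Huffman code, sent over a BSC with crossover eps.
probs = {"a": 0.5, "b": 0.3, "c": 0.2}
code  = {"a": "0", "b": "10", "c": "11"}
eps   = 0.05

def map_decode(received):
    """MAP decode of a Huffman-coded bitstream received over a BSC.

    best[i] holds (best log-metric for explaining received[:i], backpointer),
    where the backpointer is (previous position, decoded symbol).
    """
    n = len(received)
    best = [(-math.inf, None)] * (n + 1)
    best[0] = (0.0, None)
    for i in range(n):                       # metric-update over bit positions
        if best[i][0] == -math.inf:
            continue
        for sym, cw in code.items():
            j = i + len(cw)
            if j > n:
                continue
            m = best[i][0] + math.log(probs[sym])      # source prior term
            for k, bit in enumerate(cw):               # channel likelihood term
                m += math.log(1 - eps) if bit == received[i + k] else math.log(eps)
            if m > best[j][0]:
                best[j] = (m, (i, sym))
    out, i = [], n                            # backtrack the winning path
    while i > 0:
        i, sym = best[i][1]
        out.append(sym)
    return "".join(reversed(out))
```

With a clean channel the decoder recovers the transmitted symbols exactly: "aab" encodes to "0010", and `map_decode("0010")` returns `"aab"`. Flipping the second bit gives "0110", which the MAP decoder explains as `"aca"` (whose encoding matches the received stream with no bit errors), illustrating how the prior and the channel likelihood trade off in the metric.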