Which Entropy Coder?
D. McLean, J. Agbinya, J. Cannon, Rong-Yu Chao, Youngkyu Choi, Haley Jones
Fourth International Symposium on Signal Processing and Its Applications, 1996
DOI: 10.1109/ISSPA.1996.615083
Citations: 1
Abstract
We discuss the relative merits of the Huffman and the Arithmetic coders. We show that the Huffman coder is generally competitive with the Arithmetic coder. Composite symbols can be formed by grouping M values from the original data sequence. This has the obvious advantage that, if the most common composite symbol has a probability greater than 0.5, it will be encoded by a 1-bit symbol in the Huffman coder (HC), meaning that the original values will be encoded with 1/M bits per sample. A second, more subtle, advantage is that if there are significant correlations between the values to be encoded, grouping them will reduce the entropy of the distribution of symbols, yielding still lower bit rates.
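The grouping argument in the abstract can be illustrated with a small sketch (not code from the paper; `huffman_lengths` and `avg_bits_per_sample` are illustrative names). It builds a Huffman code over single samples and over pairs of samples (M = 2) for a skewed binary source, and shows that even without exploiting correlations, the per-sample rate drops once the composite alphabet lets the coder assign short codes to the dominant block:

```python
import heapq
from collections import Counter

def huffman_lengths(freqs):
    """Return a dict symbol -> code length for a Huffman code over freqs."""
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    # Heap entries: (frequency, tiebreaker, {symbol: depth-so-far}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees deepens every leaf in them by one bit.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

def avg_bits_per_sample(seq, group=1):
    """Average Huffman bits per original sample when coding blocks of `group` samples."""
    blocks = [tuple(seq[i:i + group]) for i in range(0, len(seq) - group + 1, group)]
    freqs = Counter(blocks)
    lengths = huffman_lengths(freqs)
    total_bits = sum(freqs[b] * lengths[b] for b in freqs)
    return total_bits / (len(blocks) * group)

# A source with p(0) = 0.9, built so the pair statistics are exact.
pairs = [(0, 0)] * 81 + [(0, 1)] * 9 + [(1, 0)] * 9 + [(1, 1)] * 1
seq = [x for p in pairs for x in p]

single = avg_bits_per_sample(seq, group=1)  # HC on {0, 1}: 1 bit/sample, no matter how skewed
paired = avg_bits_per_sample(seq, group=2)  # HC on pairs: (0,0) has probability 0.81 > 0.5
```

Here the most common composite symbol `(0, 0)` has probability 0.81 > 0.5, so it receives a 1-bit code, costing 1/2 bit per original sample for those blocks; the overall rate falls from 1.0 to 0.645 bits per sample.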