Amit Golander, S. Tahar, Lior Glass, G. Biran, Sagi Manole
2013 Data Compression Conference, March 20, 2013. DOI: 10.1109/DCC.2013.119
High Compression Rate and Ratio Using Predefined Huffman Dictionaries
Current Huffman coding modes are optimal for a single metric: compression ratio (quality) or rate (performance). We recognize that real-life data can usually be classified into families of data types, so the Huffman dictionary can be reused rather than recalculated. In this paper, we show how to balance the trade-off between compression ratio and rate without modifying existing standards or legacy decompression implementations.
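The dictionary-reuse idea in the abstract can be sketched in plain Python (a simplified illustration, not the authors' implementation): build a Huffman code table once from sample data representative of a data-type family, then reuse that table to encode subsequent messages of the same family, skipping the per-message histogram and tree-construction passes that a dynamic-Huffman encoder would repeat. The sample text and messages below are hypothetical.

```python
import heapq
from collections import Counter

def build_huffman_codes(sample: bytes) -> dict:
    """Build a Huffman code table (symbol -> bit string) from sample data."""
    freq = Counter(sample)
    # Heap entries are (weight, tiebreak_id, tree); the unique tiebreak id
    # prevents Python from ever comparing the tree payloads.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    if len(heap) == 1:  # degenerate input: a single symbol gets a 1-bit code
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next_id, (t1, t2)))
        next_id += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):  # internal node: recurse left/right
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                        # leaf: record the accumulated code
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

def encode(data: bytes, codes: dict) -> str:
    """Encode data with a prebuilt code table (all symbols must be covered)."""
    return "".join(codes[b] for b in data)

# Build the dictionary once from a representative sample of the family...
sample = b"the quick brown fox jumps over the lazy dog the the the"
codes = build_huffman_codes(sample)

# ...then reuse it across messages, trading a little ratio for rate.
bits = encode(b"the lazy fox", codes)
```

A predefined dictionary only pays off when the family's symbol statistics are stable; symbols absent from the sample would need an escape mechanism or a guaranteed-complete alphabet in a real encoder.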