{"title":"改进了一个版本的Lempel-Ziv算法的冗余","authors":"A. Wyner, A. Wyner","doi":"10.1109/ISIT.1994.394962","DOIUrl":null,"url":null,"abstract":"The fixed-database Lempel-Ziv algorithm (FDLZ) closely resembles practical versions of the LZ algorithm that are widely in use. Bender and Wolf (1991) suggested a variant of LZ which empirically appears to perform well. We suggest a finite memory version of their scheme, and show that it has redundancy /spl rho//sub n/=O(1/log n) where n is the memory size. We are concerned with a data source which is a stationary, finite-memory random sequence that takes values in an alphabet of finite size A. The data source can be losslessly encoded using (H+/spl rho//sub n/) bits per source symbol, where n is a measure of the complexity of the code, and /spl rho//sub n//spl rarr/0, as n/spl rarr//spl infin/. The LZ algorithm is a universal procedure (which does not depend on the source statistics) for encoding the source at a rate close to the entropy.<<ETX>>","PeriodicalId":331390,"journal":{"name":"Proceedings of 1994 IEEE International Symposium on Information Theory","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"42","resultStr":"{\"title\":\"Improved redundancy of a version of the Lempel-Ziv algorithm\",\"authors\":\"A. Wyner, A. Wyner\",\"doi\":\"10.1109/ISIT.1994.394962\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The fixed-database Lempel-Ziv algorithm (FDLZ) closely resembles practical versions of the LZ algorithm that are widely in use. Bender and Wolf (1991) suggested a variant of LZ which empirically appears to perform well. We suggest a finite memory version of their scheme, and show that it has redundancy /spl rho//sub n/=O(1/log n) where n is the memory size. We are concerned with a data source which is a stationary, finite-memory random sequence that takes values in an alphabet of finite size A. The data source can be losslessly encoded using (H+/spl rho//sub n/) bits per source symbol, where n is a measure of the complexity of the code, and /spl rho//sub n//spl rarr/0, as n/spl rarr//spl infin/. 
The LZ algorithm is a universal procedure (which does not depend on the source statistics) for encoding the source at a rate close to the entropy.<<ETX>>\",\"PeriodicalId\":331390,\"journal\":{\"name\":\"Proceedings of 1994 IEEE International Symposium on Information Theory\",\"volume\":\"19 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-06-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"42\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1994 IEEE International Symposium on Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISIT.1994.394962\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE International Symposium on Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.1994.394962","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The fixed-database Lempel-Ziv algorithm (FDLZ) closely resembles the practical versions of the LZ algorithm that are in wide use. Bender and Wolf (1991) suggested a variant of LZ which empirically appears to perform well. We suggest a finite-memory version of their scheme and show that it has redundancy ρ_n = O(1/log n), where n is the memory size. We are concerned with a data source that is a stationary, finite-memory random sequence taking values in an alphabet of finite size A. The data source can be losslessly encoded using (H + ρ_n) bits per source symbol, where H is the source entropy, n is a measure of the complexity of the code, and ρ_n → 0 as n → ∞. The LZ algorithm is a universal procedure (one that does not depend on the source statistics) for encoding the source at a rate close to the entropy.
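To make the fixed-database idea concrete, the following is a minimal illustrative sketch in Python of a greedy fixed-database LZ parse. It is not the Bender-Wolf variant or the finite-memory scheme analysed in the paper; the function name fdlz_parse, the greedy longest-match rule, the literal fallback, and the toy binary database are assumptions made purely for illustration.

```python
# A minimal sketch (not the authors' exact scheme): encoder and decoder share
# a fixed database of n previously seen source symbols, and each phrase of the
# new data is the longest prefix that already occurs somewhere in the database.

def fdlz_parse(database: str, data: str):
    """Greedily parse `data` into phrases that occur in `database`.

    Returns a list of tokens: (pos, length) for a match into the database,
    or ('lit', symbol) when the next symbol does not occur in it at all.
    """
    tokens = []
    i = 0
    while i < len(data):
        # Find the longest prefix data[i:i+L] that appears in the database.
        best_pos, best_len = -1, 0
        length = 1
        while i + length <= len(data):
            pos = database.find(data[i:i + length])
            if pos < 0:
                break
            best_pos, best_len = pos, length
            length += 1
        if best_len == 0:
            tokens.append(('lit', data[i]))   # symbol absent from the database
            i += 1
        else:
            tokens.append((best_pos, best_len))
            i += best_len
    return tokens


if __name__ == "__main__":
    # Toy example over the binary alphabet {0, 1}.
    database = "0110100110010110" * 8   # fixed database of size n = 128
    data = "0110010110100101"
    for token in fdlz_parse(database, data):
        print(token)
    # Describing the position of each (pos, length) match costs roughly
    # log2(n) bits; as the database size n grows, typical match lengths also
    # grow (on the order of log n for a finite-memory source), which is what
    # drives per-symbol redundancy bounds of the kind discussed above.
```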