{"title":"Causal coding of individual sequences and the Lempel-Ziv differential entropy","authors":"T. Linder, R. Zamir","doi":"10.1109/ISIT.2004.1365597","DOIUrl":null,"url":null,"abstract":"In causal source coding, the reconstruction is restricted to be a function of the present and past source samples, while the variable-length code stream may be noncausal. Neuhoff and Gilbert [1982] showed that for memoryless sources, optimum performance among all causal lossy source codes is achieved by time-sharing at most two memoryless codes (scalar quantizers) followed by entropy coding. We extend this result to causal coding of individual sequences in the limit of small distortion. The optimum performance of finite-memory variable-rate causal codes in this setting is characterized by a deterministic analogue of differential entropy, which we call \"Lempel-Ziv differential entropy.\" As a by-product, we also provide an individual-sequence version of the Shannon lower bound to the rate-distortion function.","PeriodicalId":269907,"journal":{"name":"International Symposium onInformation Theory, 2004. ISIT 2004. Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2004-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium onInformation Theory, 2004. ISIT 2004. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.2004.1365597","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In causal source coding, the reconstruction is restricted to be a function of the present and past source samples, while the variable-length code stream may be noncausal. Neuhoff and Gilbert [1982] showed that for memoryless sources, optimum performance among all causal lossy source codes is achieved by time-sharing at most two memoryless codes (scalar quantizers) followed by entropy coding. We extend this result to causal coding of individual sequences in the limit of small distortion. The optimum performance of finite-memory variable-rate causal codes in this setting is characterized by a deterministic analogue of differential entropy, which we call "Lempel-Ziv differential entropy." As a by-product, we also provide an individual-sequence version of the Shannon lower bound to the rate-distortion function.
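
To make the setting concrete, below is a minimal, illustrative sketch (not the paper's construction) of a causal lossy code: a scalar quantizer, whose reconstruction at each time depends only on the current sample, followed by a Lempel-Ziv-style estimate of the coding rate via LZ78 incremental parsing. The uniform quantizer step sizes, the sinusoidal "individual sequence", and the function names are all assumptions made for the example.

```python
import math

def scalar_quantize(x, step):
    """Causal reconstruction: each output depends only on the current sample."""
    return [step * round(v / step) for v in x]

def lz78_phrase_count(symbols):
    """Count phrases produced by LZ78 incremental parsing of the symbol sequence."""
    dictionary = {(): 0}
    phrase = ()
    count = 0
    for s in symbols:
        candidate = phrase + (s,)
        if candidate in dictionary:
            phrase = candidate
        else:
            dictionary[candidate] = len(dictionary)
            count += 1
            phrase = ()
    if phrase:
        count += 1  # final, possibly incomplete, phrase
    return count

def lz_rate_estimate(symbols):
    """Approximate LZ78 code length per sample: c(n) log2 c(n) / n bits."""
    n = len(symbols)
    c = lz78_phrase_count(symbols)
    return c * math.log2(max(c, 2)) / n

# Quantize a deterministic individual sequence at several resolutions and
# compare the resulting distortion with the LZ-based rate estimate.
x = [math.sin(0.1 * t) for t in range(2000)]
for step in (0.5, 0.25, 0.125):
    xhat = scalar_quantize(x, step)
    distortion = sum((a - b) ** 2 for a, b in zip(x, xhat)) / len(x)
    rate = lz_rate_estimate(tuple(xhat))
    print(f"step={step:<6} distortion={distortion:.5f}  LZ rate ~ {rate:.3f} bits/sample")
```

As the quantizer step shrinks, the distortion decreases while the LZ-estimated rate grows; the paper's small-distortion analysis characterizes this trade-off for causal codes through the Lempel-Ziv differential entropy of the individual sequence.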