{"title":"The confidence interval of entropy estimation through a noisy channel","authors":"Siu-Wai Ho, T. Chan, A. Grant","doi":"10.1109/CIG.2010.5592695","DOIUrl":null,"url":null,"abstract":"Suppose a stationary memoryless source is observed through a discrete memoryless channel. Determining analytical confidence intervals on the source entropy is known to be a difficult problem, even when the observation channel is noiseless. In this paper, we determine confidence intervals for estimation of source entropy over discrete memoryless channels with invertible transition matrices. A lower bound is given for the minimum number of samples required to guarantee a desired confidence interval. All these results do not require any prior knowledge of the source distribution, other than the alphabet size. When the alphabet size is countably infinite or unknown, we illustrate an inherent difficulty in estimating the source entropy.","PeriodicalId":354925,"journal":{"name":"2010 IEEE Information Theory Workshop","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 IEEE Information Theory Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIG.2010.5592695","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7
Abstract
Suppose a stationary memoryless source is observed through a discrete memoryless channel. Determining analytical confidence intervals on the source entropy is known to be a difficult problem, even when the observation channel is noiseless. In this paper, we determine confidence intervals for estimating the source entropy through discrete memoryless channels with invertible transition matrices, and we give a lower bound on the minimum number of samples required to guarantee a desired confidence interval. None of these results requires prior knowledge of the source distribution beyond its alphabet size. When the alphabet size is countably infinite or unknown, we illustrate an inherent difficulty in estimating the source entropy.
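As a rough illustration of the setting, the sketch below shows a plug-in estimate of the source entropy when the channel transition matrix is known and invertible: the empirical output distribution is mapped back through the inverse of the channel matrix and the entropy of the resulting estimate is computed. This is only a minimal illustrative sketch of the channel-inversion idea, not the paper's confidence-interval construction; the function name and the example channel are assumptions for illustration.

import numpy as np

def plugin_entropy_estimate(samples, W):
    """Plug-in estimate of H(X) from channel-output samples.

    Assumes the source alphabet size equals the output alphabet size and
    W[x, y] = P(Y = y | X = x) is a known, invertible transition matrix.
    """
    k = W.shape[0]                                  # alphabet size (assumed known)
    counts = np.bincount(samples, minlength=k)      # empirical counts of the outputs Y
    q_hat = counts / counts.sum()                   # empirical output distribution
    p_hat = q_hat @ np.linalg.inv(W)                # invert the channel: q = p W  =>  p = q W^{-1}
    p_hat = np.clip(p_hat, 0.0, None)               # sampling noise may give slightly negative entries
    p_hat /= p_hat.sum()                            # renormalize to a valid distribution
    nz = p_hat[p_hat > 0]
    return -np.sum(nz * np.log2(nz))                # plug-in entropy estimate, in bits

# Example: a binary symmetric channel with crossover probability 0.1.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
rng = np.random.default_rng(0)
x = rng.choice(2, size=10_000, p=[0.3, 0.7])          # hidden source samples
y = np.array([rng.choice(2, p=W[xi]) for xi in x])    # observations through the channel
print(plugin_entropy_estimate(y, W))                  # close to H(0.3) ~ 0.881 bits

Note that such a point estimate says nothing by itself about how far it may be from the true entropy; quantifying that gap for a prescribed confidence level, without prior knowledge of the source distribution, is the question the paper addresses.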