{"title":"Informational divergence approximations to product distributions","authors":"J. Hou, G. Kramer","doi":"10.1109/CWIT.2013.6621596","DOIUrl":"https://doi.org/10.1109/CWIT.2013.6621596","url":null,"abstract":"The minimum rate needed to accurately approximate a product distribution based on an unnormalized informational divergence is shown to be a mutual information. This result subsumes results of Wyner on common information and Han-Verdú on resolvability. The result also extends to cases where the source distribution is unknown but the entropy is known.","PeriodicalId":398936,"journal":{"name":"2013 13th Canadian Workshop on Information Theory","volume":"87 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133536041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}