{"title":"关于条件期望和条件互信息的道不等式的最终形式","authors":"R. Ahlswede","doi":"10.3934/amc.2007.1.239","DOIUrl":null,"url":null,"abstract":"Summary form only given: Recently Terence Tao approached Szemeredi's regularity lemma from the perspectives of probability theory and of information theory instead of graph theory and found a stronger variant of this lemma, which involves a new parameter. To pass from an entropy formulation to an expectation formulation he found the following lemma. Let Y, and X, X' be discrete random variables taking values in y and x, respectively, where y sub [-1, 1], and with X' = f(X) for a (deterministic) function f. Then we have E(|E(Y|X') - E(Y|X)|) les 2I(X nland Y|X')1/2. We show that the constant 2 can be improved to (2ln2)1/2 and that this is the best possible constant","PeriodicalId":115298,"journal":{"name":"2006 IEEE International Symposium on Information Theory","volume":"102 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"The final form of Tao's inequality relating conditional expectation and conditional mutual information\",\"authors\":\"R. Ahlswede\",\"doi\":\"10.3934/amc.2007.1.239\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Summary form only given: Recently Terence Tao approached Szemeredi's regularity lemma from the perspectives of probability theory and of information theory instead of graph theory and found a stronger variant of this lemma, which involves a new parameter. To pass from an entropy formulation to an expectation formulation he found the following lemma. Let Y, and X, X' be discrete random variables taking values in y and x, respectively, where y sub [-1, 1], and with X' = f(X) for a (deterministic) function f. Then we have E(|E(Y|X') - E(Y|X)|) les 2I(X nland Y|X')1/2. We show that the constant 2 can be improved to (2ln2)1/2 and that this is the best possible constant\",\"PeriodicalId\":115298,\"journal\":{\"name\":\"2006 IEEE International Symposium on Information Theory\",\"volume\":\"102 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2006-07-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2006 IEEE International Symposium on Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3934/amc.2007.1.239\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE International Symposium on Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3934/amc.2007.1.239","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The final form of Tao's inequality relating conditional expectation and conditional mutual information
Summary form only given: Recently, Terence Tao approached Szemerédi's regularity lemma from the perspectives of probability theory and information theory, rather than graph theory, and found a stronger variant of the lemma, which involves a new parameter. To pass from an entropy formulation to an expectation formulation, he established the following lemma. Let $Y$ and $X, X'$ be discrete random variables taking values in $\mathcal{Y}$ and $\mathcal{X}$, respectively, where $\mathcal{Y} \subset [-1, 1]$, and with $X' = f(X)$ for a (deterministic) function $f$. Then $\mathbb{E}\bigl(|\mathbb{E}(Y \mid X') - \mathbb{E}(Y \mid X)|\bigr) \le 2\, I(X \wedge Y \mid X')^{1/2}$. We show that the constant $2$ can be improved to $(2 \ln 2)^{1/2}$, and that this is the best possible constant.
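As an illustration, here is a minimal numerical sketch (not from the paper) that checks the stated inequality on a small randomly generated joint distribution. It assumes the conditional mutual information $I(X \wedge Y \mid X')$ is measured in bits, which is consistent with the improved constant $(2 \ln 2)^{1/2}$; the alphabets, the coarsening $f$, and all variable names are illustrative choices.

```python
# Numerical sanity check of E(|E(Y|X') - E(Y|X)|) <= sqrt(2 ln2 * I(X ^ Y | X'))
# with I measured in bits (an assumption consistent with the abstract's constant).
# The joint table `p` and the map `f` are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Finite alphabets: X in {0,1,2,3}, Y in {-1, 0, 1} (so the Y-values lie in [-1, 1]).
x_vals = np.arange(4)
y_vals = np.array([-1.0, 0.0, 1.0])

# Random joint distribution p(x, y).
p = rng.random((len(x_vals), len(y_vals)))
p /= p.sum()

# X' = f(X): a deterministic coarsening of X.
def f(x):
    return x // 2

xp_vals = np.unique(f(x_vals))  # values of X': {0, 1}

# Left-hand side: E(|E(Y|X') - E(Y|X)|), an expectation over X.
p_x = p.sum(axis=1)                           # marginal of X
e_y_given_x = (p * y_vals).sum(axis=1) / p_x  # E(Y|X=x)
e_y_given_xp = np.array([                     # E(Y|X'=x'), averaged over the fibre of f
    (p_x[f(x_vals) == xp] * e_y_given_x[f(x_vals) == xp]).sum()
    / p_x[f(x_vals) == xp].sum()
    for xp in xp_vals
])
lhs = (p_x * np.abs(e_y_given_xp[f(x_vals)] - e_y_given_x)).sum()

# Conditional mutual information I(X ^ Y | X') in bits.
# Since X' = f(X), we have I(X ^ Y | X') = H(Y|X') - H(Y|X).
def cond_entropy_bits(p_joint):
    """H(second coordinate | first coordinate) of a joint table, in bits."""
    p_first = p_joint.sum(axis=1, keepdims=True)
    cond = p_joint / p_first
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p_joint > 0, -p_joint * np.log2(cond), 0.0)
    return terms.sum()

p_xp_y = np.zeros((len(xp_vals), len(y_vals)))  # joint of (X', Y)
for i, x in enumerate(x_vals):
    p_xp_y[f(x)] += p[i]
i_bits = cond_entropy_bits(p_xp_y) - cond_entropy_bits(p)

print(f"LHS   E|E(Y|X') - E(Y|X)|        = {lhs:.4f}")
print(f"Tao's bound        2*sqrt(I)     = {2 * np.sqrt(i_bits):.4f}")
print(f"Improved bound  sqrt(2 ln2 * I)  = {np.sqrt(2 * np.log(2) * i_bits):.4f}")
```

Running the sketch only confirms numerically that the left-hand side stays below both bounds for this one distribution; it does not, of course, establish the optimality of $(2 \ln 2)^{1/2}$, which is the contribution of the paper.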