{"title":"熵不等式的机器证明","authors":"R. Yeung, Cheuk Ting Li","doi":"10.1109/mbits.2021.3123197","DOIUrl":null,"url":null,"abstract":"The entropy function plays a central role in information theory. Constraints on the entropy function in the form of inequalities, viz. entropy inequalities (often conditional on certain Markov conditions imposed by the problem under consideration), are indispensable tools for proving converse coding theorems. In this expository article, we give an overview of this fundamental subject. After presenting a geometrical framework for the entropy function, we explain how an entropy inequality can be formulated, with or without constraints on the entropy function. Among all entropy inequalities, Shannon-type inequalities, namely those implied by the nonnegativity of Shannon’s information measures, are best understood. The main focus of this article is the verification of Shannon-type inequalities, which in fact can be formulated as a linear programming problem. ITIP, a software package developed for this purpose, as well as two of its variants, AITIP and PSITIP, are discussed. This article ends with a discussion on the hardness of verifying entropy inequalities in general.","PeriodicalId":448036,"journal":{"name":"IEEE BITS the Information Theory Magazine","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Machine-Proving of Entropy Inequalities\",\"authors\":\"R. Yeung, Cheuk Ting Li\",\"doi\":\"10.1109/mbits.2021.3123197\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The entropy function plays a central role in information theory. Constraints on the entropy function in the form of inequalities, viz. entropy inequalities (often conditional on certain Markov conditions imposed by the problem under consideration), are indispensable tools for proving converse coding theorems. In this expository article, we give an overview of this fundamental subject. After presenting a geometrical framework for the entropy function, we explain how an entropy inequality can be formulated, with or without constraints on the entropy function. Among all entropy inequalities, Shannon-type inequalities, namely those implied by the nonnegativity of Shannon’s information measures, are best understood. The main focus of this article is the verification of Shannon-type inequalities, which in fact can be formulated as a linear programming problem. ITIP, a software package developed for this purpose, as well as two of its variants, AITIP and PSITIP, are discussed. 
This article ends with a discussion on the hardness of verifying entropy inequalities in general.\",\"PeriodicalId\":448036,\"journal\":{\"name\":\"IEEE BITS the Information Theory Magazine\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE BITS the Information Theory Magazine\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/mbits.2021.3123197\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE BITS the Information Theory Magazine","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/mbits.2021.3123197","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The entropy function plays a central role in information theory. Constraints on the entropy function in the form of inequalities, viz. entropy inequalities (often conditional on certain Markov conditions imposed by the problem under consideration), are indispensable tools for proving converse coding theorems. In this expository article, we give an overview of this fundamental subject. After presenting a geometrical framework for the entropy function, we explain how an entropy inequality can be formulated, with or without constraints on the entropy function. Among all entropy inequalities, Shannon-type inequalities, namely those implied by the nonnegativity of Shannon’s information measures, are best understood. The main focus of this article is the verification of Shannon-type inequalities, which in fact can be formulated as a linear programming problem. ITIP, a software package developed for this purpose, as well as two of its variants, AITIP and PSITIP, are discussed. This article ends with a discussion on the hardness of verifying entropy inequalities in general.
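As the abstract notes, checking whether a candidate inequality is Shannon-type reduces to a linear program over the elemental Shannon inequalities, which is the approach behind ITIP and its variants. The sketch below is not taken from ITIP, AITIP, or PSITIP; it is a minimal illustration of that LP formulation using scipy, with a bitmask encoding of subsets and helper names (elemental_inequalities, is_shannon_type) introduced here purely for illustration.

```python
# Minimal sketch of the LP check for Shannon-type inequalities (not the ITIP code base).
# Coordinates of the entropy space are indexed by nonempty subsets of {0, ..., n-1},
# encoded as bitmasks 1 .. 2**n - 1, with h[mask - 1] standing for H(X_A).
from itertools import combinations
import numpy as np
from scipy.optimize import linprog


def elemental_inequalities(n):
    """Rows g with g . h >= 0 for every elemental Shannon inequality on n variables."""
    dim = 2 ** n - 1
    full = dim  # bitmask of the full set {0, ..., n-1}
    rows = []
    # H(X_i | X_{N \ {i}}) >= 0  :  h(N) - h(N \ {i}) >= 0
    for i in range(n):
        g = np.zeros(dim)
        g[full - 1] += 1
        g[(full ^ (1 << i)) - 1] -= 1
        rows.append(g)
    # I(X_i ; X_j | X_K) >= 0 for i < j and K a subset of N \ {i, j}
    for i, j in combinations(range(n), 2):
        others = [k for k in range(n) if k not in (i, j)]
        for r in range(len(others) + 1):
            for K in combinations(others, r):
                kmask = sum(1 << k for k in K)
                g = np.zeros(dim)
                g[(kmask | 1 << i) - 1] += 1
                g[(kmask | 1 << j) - 1] += 1
                g[(kmask | 1 << i | 1 << j) - 1] -= 1
                if kmask:  # h(empty set) = 0, so that term is simply omitted
                    g[kmask - 1] -= 1
                rows.append(g)
    return np.array(rows)


def is_shannon_type(b, n):
    """True iff b . h >= 0 is implied by the elemental inequalities, i.e. the
    minimum of b . h over the cone they cut out is 0 rather than unbounded."""
    G = elemental_inequalities(n)
    res = linprog(c=b, A_ub=-G, b_ub=np.zeros(len(G)),
                  bounds=(None, None), method="highs")
    return res.status == 0 and abs(res.fun) < 1e-9


n = 3
dim = 2 ** n - 1

# Independence bound: H(X1) + H(X2) + H(X3) - H(X1, X2, X3) >= 0
b = np.zeros(dim)
for i in range(n):
    b[(1 << i) - 1] += 1
b[dim - 1] -= 1
print(is_shannon_type(b, n))   # True: provable from the elemental inequalities

# A false candidate: H(X1) - H(X1, X2, X3) >= 0 (the LP is unbounded below)
b2 = np.zeros(dim)
b2[(1 << 0) - 1] = 1
b2[dim - 1] = -1
print(is_shannon_type(b2, n))  # False
```

The design choice mirrors the formulation described above: a candidate inequality written as b . h >= 0 is Shannon-type exactly when minimizing b . h subject to the elemental inequalities is bounded (the minimum is then 0, attained at h = 0); if the LP is unbounded below, the inequality is not implied by them. The actual packages add constraint handling for Markov and independence conditions and, in PSITIP's case, a much richer symbolic interface.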