{"title":"连续时间马尔可夫链曲率维条件下的熵信息不等式","authors":"Frederic Weber","doi":"10.1214/21-EJP627","DOIUrl":null,"url":null,"abstract":"In the setting of reversible continuous-time Markov chains, the $CD_\\Upsilon$ condition has been shown recently to be a consistent analogue to the Bakry-Emery condition in the diffusive setting in terms of proving Li-Yau inequalities under a finite dimension term and proving the modified logarithmic Sobolev inequality under a positive curvature bound. In this article we examine the case where both is given, a finite dimension term and a positive curvature bound. For this purpose we introduce the $CD_\\Upsilon(\\kappa,F)$ condition, where the dimension term is expressed by a so called $CD$-function $F$. We derive functional inequalities relating the entropy to the Fisher information, which we will call entropy-information inequalities. Further, we deduce applications of entropy-information inequalities such as ultracontractivity bounds, exponential integrability of Lipschitz functions, finite diameter bounds and a modified version of the celebrated Nash inequality.","PeriodicalId":8470,"journal":{"name":"arXiv: Probability","volume":"4 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Entropy-information inequalities under curvature-dimension conditions for continuous-time Markov chains\",\"authors\":\"Frederic Weber\",\"doi\":\"10.1214/21-EJP627\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the setting of reversible continuous-time Markov chains, the $CD_\\\\Upsilon$ condition has been shown recently to be a consistent analogue to the Bakry-Emery condition in the diffusive setting in terms of proving Li-Yau inequalities under a finite dimension term and proving the modified logarithmic Sobolev inequality under a positive curvature bound. In this article we examine the case where both is given, a finite dimension term and a positive curvature bound. For this purpose we introduce the $CD_\\\\Upsilon(\\\\kappa,F)$ condition, where the dimension term is expressed by a so called $CD$-function $F$. We derive functional inequalities relating the entropy to the Fisher information, which we will call entropy-information inequalities. 
Further, we deduce applications of entropy-information inequalities such as ultracontractivity bounds, exponential integrability of Lipschitz functions, finite diameter bounds and a modified version of the celebrated Nash inequality.\",\"PeriodicalId\":8470,\"journal\":{\"name\":\"arXiv: Probability\",\"volume\":\"4 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv: Probability\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1214/21-EJP627\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Probability","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1214/21-EJP627","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Entropy-information inequalities under curvature-dimension conditions for continuous-time Markov chains
In the setting of reversible continuous-time Markov chains, the $CD_\Upsilon$ condition has recently been shown to be a consistent analogue of the Bakry-Émery condition from the diffusive setting: it yields Li-Yau inequalities under a finite dimension term and the modified logarithmic Sobolev inequality under a positive curvature bound. In this article we examine the case where both are given, a finite dimension term and a positive curvature bound. For this purpose we introduce the $CD_\Upsilon(\kappa,F)$ condition, where the dimension term is encoded by a so-called $CD$-function $F$. We derive functional inequalities relating the entropy to the Fisher information, which we call entropy-information inequalities. Further, we deduce applications of these inequalities such as ultracontractivity bounds, exponential integrability of Lipschitz functions, finite diameter bounds, and a modified version of the celebrated Nash inequality.
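For orientation, the display below sketches the general shape of an entropy-information inequality; it is an illustration only, not the precise statements proved in the paper. The notation $\mu$, $\mathcal{E}$, $\operatorname{Ent}_\mu$, $\mathcal{I}$ and the comparison function $\Phi$ are introduced here for this sketch. With $\mu$ the reversible invariant measure and $\mathcal{E}$ the Dirichlet form of the chain, set
\[
\operatorname{Ent}_\mu(f) \;=\; \int f \log f \, d\mu \;-\; \Big(\int f \, d\mu\Big)\log\Big(\int f \, d\mu\Big),
\qquad
\mathcal{I}(f) \;=\; \mathcal{E}(f,\log f),
\]
for suitable $f>0$. An inequality of entropy-information type then bounds the entropy by a function of the Fisher information,
\[
\operatorname{Ent}_\mu(f) \;\le\; \Big(\int f \, d\mu\Big)\, \Phi\!\left(\frac{\mathcal{I}(f)}{\int f \, d\mu}\right),
\]
where, for instance, the linear choice $\Phi(r)=r/(2\kappa)$ corresponds to the modified logarithmic Sobolev inequality under a positive curvature bound $\kappa>0$ in the classical diffusive Bakry-Émery case; in this sketch the effect of a finite dimension term is modeled by a nonlinear $\Phi$.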