A Logarithmic Decomposition and a Signed Measure Space for Entropy

Keenan J. A. Down, Pedro A. M. Mediano
{"title":"熵的对数分解和有符号度量空间","authors":"Keenan J. A. Down, Pedro A. M. Mediano","doi":"arxiv-2409.03732","DOIUrl":null,"url":null,"abstract":"The Shannon entropy of a random variable X has much behaviour analogous to a\nsigned measure. Previous work has explored this connection by defining a signed\nmeasure on abstract sets, which are taken to represent the information that\ndifferent random variables contain. This construction is sufficient to derive\nmany measure-theoretical counterparts to information quantities such as the\nmutual information $I(X; Y) = \\mu(\\tilde{X} \\cap \\tilde{Y})$, the joint entropy\n$H(X,Y) = \\mu(\\tilde{X} \\cup \\tilde{Y})$, and the conditional entropy $H(X|Y) =\n\\mu(\\tilde{X} \\setminus \\tilde{Y})$. Here we provide concrete characterisations\nof these abstract sets and a corresponding signed measure, and in doing so we\ndemonstrate that there exists a much finer decomposition with intuitive\nproperties which we call the logarithmic decomposition (LD). We show that this\nsigned measure space has the useful property that its logarithmic atoms are\neasily characterised with negative or positive entropy, while also being\nconsistent with Yeung's I-measure. We present the usability of our approach by\nre-examining the G\\'acs-K\\\"orner common information and the Wyner common\ninformation from this new geometric perspective and characterising it in terms\nof our logarithmic atoms - a property we call logarithmic decomposability. We\npresent possible extensions of this construction to continuous probability\ndistributions before discussing implications for quality-led information\ntheory. Lastly, we apply our new decomposition to examine the Dyadic and\nTriadic systems of James and Crutchfield and show that, in contrast to the\nI-measure alone, our decomposition is able to qualitatively distinguish between\nthem.","PeriodicalId":501082,"journal":{"name":"arXiv - MATH - Information Theory","volume":"69 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Logarithmic Decomposition and a Signed Measure Space for Entropy\",\"authors\":\"Keenan J. A. Down, Pedro A. M. Mediano\",\"doi\":\"arxiv-2409.03732\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Shannon entropy of a random variable X has much behaviour analogous to a\\nsigned measure. Previous work has explored this connection by defining a signed\\nmeasure on abstract sets, which are taken to represent the information that\\ndifferent random variables contain. This construction is sufficient to derive\\nmany measure-theoretical counterparts to information quantities such as the\\nmutual information $I(X; Y) = \\\\mu(\\\\tilde{X} \\\\cap \\\\tilde{Y})$, the joint entropy\\n$H(X,Y) = \\\\mu(\\\\tilde{X} \\\\cup \\\\tilde{Y})$, and the conditional entropy $H(X|Y) =\\n\\\\mu(\\\\tilde{X} \\\\setminus \\\\tilde{Y})$. Here we provide concrete characterisations\\nof these abstract sets and a corresponding signed measure, and in doing so we\\ndemonstrate that there exists a much finer decomposition with intuitive\\nproperties which we call the logarithmic decomposition (LD). We show that this\\nsigned measure space has the useful property that its logarithmic atoms are\\neasily characterised with negative or positive entropy, while also being\\nconsistent with Yeung's I-measure. 
We present the usability of our approach by\\nre-examining the G\\\\'acs-K\\\\\\\"orner common information and the Wyner common\\ninformation from this new geometric perspective and characterising it in terms\\nof our logarithmic atoms - a property we call logarithmic decomposability. We\\npresent possible extensions of this construction to continuous probability\\ndistributions before discussing implications for quality-led information\\ntheory. Lastly, we apply our new decomposition to examine the Dyadic and\\nTriadic systems of James and Crutchfield and show that, in contrast to the\\nI-measure alone, our decomposition is able to qualitatively distinguish between\\nthem.\",\"PeriodicalId\":501082,\"journal\":{\"name\":\"arXiv - MATH - Information Theory\",\"volume\":\"69 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.03732\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.03732","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

The Shannon entropy of a random variable X has much behaviour analogous to a signed measure. Previous work has explored this connection by defining a signed measure on abstract sets, which are taken to represent the information that different random variables contain. This construction is sufficient to derive many measure-theoretical counterparts to information quantities such as the mutual information $I(X; Y) = \mu(\tilde{X} \cap \tilde{Y})$, the joint entropy $H(X,Y) = \mu(\tilde{X} \cup \tilde{Y})$, and the conditional entropy $H(X|Y) = \mu(\tilde{X} \setminus \tilde{Y})$. Here we provide concrete characterisations of these abstract sets and a corresponding signed measure, and in doing so we demonstrate that there exists a much finer decomposition with intuitive properties which we call the logarithmic decomposition (LD). We show that this signed measure space has the useful property that its logarithmic atoms are easily characterised with negative or positive entropy, while also being consistent with Yeung's I-measure. We present the usability of our approach by re-examining the Gács-Körner common information and the Wyner common information from this new geometric perspective and characterising it in terms of our logarithmic atoms, a property we call logarithmic decomposability. We present possible extensions of this construction to continuous probability distributions before discussing implications for quality-led information theory. Lastly, we apply our new decomposition to examine the Dyadic and Triadic systems of James and Crutchfield and show that, in contrast to the I-measure alone, our decomposition is able to qualitatively distinguish between them.
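As a point of orientation (not taken from the paper itself), the measure-style identities quoted above reduce, for discrete variables, to the familiar inclusion-exclusion relations $I(X;Y) = H(X) + H(Y) - H(X,Y)$ and $H(X|Y) = H(X,Y) - H(Y)$, mirroring $\mu(\tilde{X} \cap \tilde{Y})$ and $\mu(\tilde{X} \setminus \tilde{Y})$. The minimal Python sketch below, using an arbitrary illustrative joint distribution (not one from the paper), computes these quantities and checks the relations numerically.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array, ignoring zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to make the identities concrete.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

px = pxy.sum(axis=1)   # marginal of X
py = pxy.sum(axis=0)   # marginal of Y

H_X  = entropy(px)
H_Y  = entropy(py)
H_XY = entropy(pxy.flatten())

# Measure-style identities:
#   I(X;Y) = mu(X~ ∩ Y~) = H(X) + H(Y) - H(X,Y)   (inclusion-exclusion)
#   H(X|Y) = mu(X~ \ Y~) = H(X,Y) - H(Y)
I_XY  = H_X + H_Y - H_XY
H_XgY = H_XY - H_Y

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(Y)   = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
print(f"H(X|Y) = {H_XgY:.4f} bits")

# Union identity: mu(X~ ∪ Y~) = H(X,Y) = H(X) + H(Y) - I(X;Y)
assert np.isclose(H_XY, H_X + H_Y - I_XY)
```

Any other small joint distribution would serve equally well; the point is only that the mutual information behaves like the measure of an intersection, while the conditional entropy behaves like a set difference.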