{"title":"A Logarithmic Decomposition and a Signed Measure Space for Entropy","authors":"Keenan J. A. Down, Pedro A. M. Mediano","doi":"arxiv-2409.03732","DOIUrl":null,"url":null,"abstract":"The Shannon entropy of a random variable X has much behaviour analogous to a\nsigned measure. Previous work has explored this connection by defining a signed\nmeasure on abstract sets, which are taken to represent the information that\ndifferent random variables contain. This construction is sufficient to derive\nmany measure-theoretical counterparts to information quantities such as the\nmutual information $I(X; Y) = \\mu(\\tilde{X} \\cap \\tilde{Y})$, the joint entropy\n$H(X,Y) = \\mu(\\tilde{X} \\cup \\tilde{Y})$, and the conditional entropy $H(X|Y) =\n\\mu(\\tilde{X} \\setminus \\tilde{Y})$. Here we provide concrete characterisations\nof these abstract sets and a corresponding signed measure, and in doing so we\ndemonstrate that there exists a much finer decomposition with intuitive\nproperties which we call the logarithmic decomposition (LD). We show that this\nsigned measure space has the useful property that its logarithmic atoms are\neasily characterised with negative or positive entropy, while also being\nconsistent with Yeung's I-measure. We present the usability of our approach by\nre-examining the G\\'acs-K\\\"orner common information and the Wyner common\ninformation from this new geometric perspective and characterising it in terms\nof our logarithmic atoms - a property we call logarithmic decomposability. We\npresent possible extensions of this construction to continuous probability\ndistributions before discussing implications for quality-led information\ntheory. Lastly, we apply our new decomposition to examine the Dyadic and\nTriadic systems of James and Crutchfield and show that, in contrast to the\nI-measure alone, our decomposition is able to qualitatively distinguish between\nthem.","PeriodicalId":501082,"journal":{"name":"arXiv - MATH - Information Theory","volume":"69 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.03732","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The Shannon entropy of a random variable X exhibits behaviour closely analogous to that of a signed measure. Previous work has explored this connection by defining a signed measure on abstract sets, which are taken to represent the information that different random variables contain. This construction is sufficient to derive many measure-theoretic counterparts to information quantities, such as the mutual information $I(X; Y) = \mu(\tilde{X} \cap \tilde{Y})$, the joint entropy $H(X,Y) = \mu(\tilde{X} \cup \tilde{Y})$, and the conditional entropy $H(X|Y) = \mu(\tilde{X} \setminus \tilde{Y})$. Here we provide concrete characterisations of these abstract sets and a corresponding signed measure, and in doing so we demonstrate that there exists a much finer decomposition with intuitive properties, which we call the logarithmic decomposition (LD). We show that this signed measure space has the useful property that its logarithmic atoms are easily characterised as carrying negative or positive entropy, while also being consistent with Yeung's I-measure. We demonstrate the usability of our approach by re-examining the Gács-Körner common information and the Wyner common information from this new geometric perspective and characterising them in terms of our logarithmic atoms, a property we call logarithmic decomposability. We present possible extensions of this construction to continuous probability distributions before discussing implications for quality-led information theory. Lastly, we apply our new decomposition to examine the dyadic and triadic systems of James and Crutchfield and show that, in contrast to the I-measure alone, our decomposition is able to qualitatively distinguish between them.
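
As a quick illustration of the signed-measure identities quoted in the abstract, the following Python sketch (not part of the paper; the toy joint distribution and variable names are illustrative assumptions) numerically checks that the standard Shannon quantities obey the inclusion-exclusion relations that the abstract sets $\tilde{X}$ and $\tilde{Y}$ encode.

import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability array (zero entries contribute 0).
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Toy joint distribution over two binary variables X and Y (illustrative choice).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

H_XY = entropy(p_xy.ravel())       # mu(X~ union Y~)    = H(X, Y)
H_X  = entropy(p_xy.sum(axis=1))   # mu(X~)             = H(X)
H_Y  = entropy(p_xy.sum(axis=0))   # mu(Y~)             = H(Y)

I_XY  = H_X + H_Y - H_XY           # mu(X~ intersect Y~) = I(X; Y) by inclusion-exclusion
H_XgY = H_XY - H_Y                 # mu(X~ \ Y~)         = H(X | Y)

# Cross-check: compute I(X; Y) directly from its definition and compare.
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
I_direct = float(np.sum(p_xy * np.log2(p_xy / (p_x * p_y))))
assert abs(I_XY - I_direct) < 1e-12

print(f"H(X,Y) = {H_XY:.4f} bits, I(X;Y) = {I_XY:.4f} bits, H(X|Y) = {H_XgY:.4f} bits")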