{"title":"公共信息维度","authors":"Xinlin Li;Osama Hanna;Suhas Diggavi;Christina Fragouli","doi":"10.1109/TIT.2025.3560674","DOIUrl":null,"url":null,"abstract":"Quantifying the common information between random variables is a fundamental problem with a long history in information theory. Traditionally, common information is measured in number of bits and thus such measures are mostly informative when the common information is finite. However, the common information between continuous variables can be infinite; in such cases, a real-valued random vector <italic>W</i> may be needed to represent the common information, and to be used for instance for distributed simulation. In this paper, we propose the concept of Common Information Dimension (CID) and three variants. We compute the common information dimension for jointly Gaussian random vectors in a closed form. Moreover, we analytically prove, under two different formulations, that the growth rate of common information in the nearly infinite regime is determined by the common information dimension, for the case of two Gaussian vectors.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"71 7","pages":"4915-4938"},"PeriodicalIF":2.9000,"publicationDate":"2025-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Common Information Dimension\",\"authors\":\"Xinlin Li;Osama Hanna;Suhas Diggavi;Christina Fragouli\",\"doi\":\"10.1109/TIT.2025.3560674\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Quantifying the common information between random variables is a fundamental problem with a long history in information theory. Traditionally, common information is measured in number of bits and thus such measures are mostly informative when the common information is finite. However, the common information between continuous variables can be infinite; in such cases, a real-valued random vector <italic>W</i> may be needed to represent the common information, and to be used for instance for distributed simulation. In this paper, we propose the concept of Common Information Dimension (CID) and three variants. We compute the common information dimension for jointly Gaussian random vectors in a closed form. 
Moreover, we analytically prove, under two different formulations, that the growth rate of common information in the nearly infinite regime is determined by the common information dimension, for the case of two Gaussian vectors.\",\"PeriodicalId\":13494,\"journal\":{\"name\":\"IEEE Transactions on Information Theory\",\"volume\":\"71 7\",\"pages\":\"4915-4938\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2025-04-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Information Theory\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10965841/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Information Theory","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10965841/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Abstract:
Quantifying the common information between random variables is a fundamental problem with a long history in information theory. Traditionally, common information is measured in bits, so such measures are most informative when the common information is finite. However, the common information between continuous variables can be infinite; in such cases, a real-valued random vector W may be needed to represent the common information and to be used, for instance, for distributed simulation. In this paper, we propose the concept of Common Information Dimension (CID) along with three variants. We compute the common information dimension of jointly Gaussian random vectors in closed form. Moreover, for the case of two Gaussian vectors, we analytically prove, under two different formulations, that the growth rate of the common information in the nearly infinite regime is determined by the common information dimension.
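As a rough illustration of the Gaussian setting (not taken from the paper; the exact closed-form expression should be consulted in the article itself), one natural quantity is the number of canonical correlation coefficients between the two jointly Gaussian vectors that equal 1, i.e., the dimension of the linear component they share exactly. The sketch below computes this count from the joint covariance, under the assumption that such a count is what characterizes the common information dimension; the function names and the tolerance parameter are illustrative choices, not part of the paper.

# Illustrative sketch only: assumes (not established here) that for jointly
# Gaussian (X, Y) the common information dimension equals the number of
# canonical correlation coefficients that are exactly 1.
import numpy as np

def canonical_correlations(cov_xx, cov_yy, cov_xy):
    """Singular values of the normalized cross-covariance Sigma_x^{-1/2} Sigma_xy Sigma_y^{-1/2}."""
    def inv_sqrt(m):
        # Symmetric inverse square root via eigendecomposition (m assumed positive definite).
        w, v = np.linalg.eigh(m)
        return v @ np.diag(1.0 / np.sqrt(w)) @ v.T
    normalized = inv_sqrt(cov_xx) @ cov_xy @ inv_sqrt(cov_yy)
    return np.linalg.svd(normalized, compute_uv=False)

def common_dimension_estimate(cov_xx, cov_yy, cov_xy, tol=1e-9):
    """Count the canonical correlations equal to 1, up to a numerical tolerance."""
    rho = canonical_correlations(cov_xx, cov_yy, cov_xy)
    return int(np.sum(rho >= 1.0 - tol))

# Toy example: X = (W, N1) and Y = (W, N2) share the scalar W exactly,
# with W, N1, N2 independent standard Gaussians, so the count is 1.
if __name__ == "__main__":
    cov_xx = np.eye(2)
    cov_yy = np.eye(2)
    cov_xy = np.diag([1.0, 0.0])  # first coordinates identical, second coordinates independent
    print(common_dimension_estimate(cov_xx, cov_yy, cov_xy))  # -> 1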
Journal description:
The IEEE Transactions on Information Theory is a journal that publishes theoretical and experimental papers concerned with the transmission, processing, and utilization of information. The boundaries of acceptable subject matter are intentionally not sharply delimited. Rather, it is hoped that as the focus of research activity changes, a flexible policy will permit this Transactions to follow suit. Current appropriate topics are best reflected by recent Tables of Contents; they are summarized in the titles of editorial areas that appear on the inside front cover.