{"title":"A functional method on amount of entropy","authors":"H. Umegaki","doi":"10.2996/KMJ/1138844786","DOIUrl":null,"url":null,"abstract":"The theory of information, originated by Shannon, was applied in the new subject to investigate the theory of transformation with invariant measure by Kolmogorov and his school, cf. Rokhlin [12]. Recently, Halmos [7] gave a very clarified note relative to their investigations. While, in order to achieving the channel capacity in stationary finite memory channels, cf. Feinstein [6], some important properties of the entropy (the average amount of information) of information sources in these channels were studied by Khinchin [8], Takano [13], Traregradsky [14], Breiman [2], Parthasarathy [11] and others. The basic space of information sources of the channels is the doubly infinite product set A (the messages space) of the alphabet A, which becomes a compact metric space relative to the weak product topology and in which the shift transformation is a homeomorphism on A (so-called the Bernoulli automorphism).","PeriodicalId":318148,"journal":{"name":"Kodai Mathematical Seminar Reports","volume":"47 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1963-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Kodai Mathematical Seminar Reports","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2996/KMJ/1138844786","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
The theory of information, originated by Shannon, was applied as a new subject to the investigation of the theory of transformations with invariant measure by Kolmogorov and his school, cf. Rokhlin [12]. Recently, Halmos [7] gave a very clarifying note on their investigations. On the other hand, in order to achieve the channel capacity of stationary finite-memory channels, cf. Feinstein [6], some important properties of the entropy (the average amount of information) of the information sources of these channels were studied by Khinchin [8], Takano [13], Tsaregradsky [14], Breiman [2], Parthasarathy [11] and others. The basic space of the information sources of these channels is the doubly infinite product of the alphabet A (the message space), which becomes a compact metric space under the weak product topology and on which the shift transformation is a homeomorphism (the so-called Bernoulli automorphism).
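For orientation, the objects named in the last sentence can be written out explicitly. The following display is a standard formulation (a sketch, not the paper's own notation): the message space as the doubly infinite product of the alphabet, the shift transformation on it, and the entropy (average amount of information) of a stationary source given by a shift-invariant probability measure.

% Standard definitions consistent with the abstract (assumed, not quoted from the paper).
\[
  A^{\mathbb{Z}} \;=\; \prod_{n=-\infty}^{\infty} A,
  \qquad
  (Tx)_n \;=\; x_{n+1} \quad \bigl(x=(x_n)_{n\in\mathbb{Z}} \in A^{\mathbb{Z}}\bigr),
\]
\[
  H(\mu) \;=\; \lim_{n\to\infty} \frac{1}{n}
  \sum_{a_1,\dots,a_n\in A}
  -\,\mu\bigl([a_1,\dots,a_n]\bigr)\,\log \mu\bigl([a_1,\dots,a_n]\bigr),
\]
% where A is the finite alphabet, T is the shift (a homeomorphism of the compact
% metric space A^Z, the "Bernoulli automorphism"), mu is a shift-invariant
% probability measure on A^Z (a stationary source), and [a_1,...,a_n] denotes the
% cylinder set of messages whose coordinates 1,...,n equal a_1,...,a_n.

The limit defining H(mu) exists by the subadditivity of the block entropies, which is the standard setting in which the entropy properties cited above (Khinchin, Breiman, and others) are stated.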