{"title":"Increasing the Gap between Descriptional Complexity and Algorithmic Probability","authors":"A. Day","doi":"10.1109/CCC.2009.13","DOIUrl":null,"url":null,"abstract":"The coding theorem is a fundamental result of algorithmic information theory. A well known theorem of Gács shows that the analog of the coding theorem fails for continuous sample spaces. This means that descriptional monotonic complexity does not coincide within an additive constant with the negative logarithm of algorithmic probability. Gács's proof provided a lower bound on the difference between these values. He showed that for infinitely many finite binary strings, this difference was greater than a version of the inverse Ackermann function applied to string length. This paper establishes that this lower bound can be substantially improved. The inverse Ackermann function can be replaced with a function O(log(log(x))). This shows that in continuous sample spaces, descriptional monotonic complexity and algorithmic probability are very different. While this proof builds on the original work by Gács, it does have a number of new features, in particular, the algorithm at the heart of the proof works on sets of strings as opposed to individual strings.","PeriodicalId":158572,"journal":{"name":"2009 24th Annual IEEE Conference on Computational Complexity","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 24th Annual IEEE Conference on Computational Complexity","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCC.2009.13","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
The coding theorem is a fundamental result of algorithmic information theory. A well-known theorem of Gács shows that the analogue of the coding theorem fails for continuous sample spaces: descriptional monotonic complexity does not coincide, within an additive constant, with the negative logarithm of algorithmic probability. Gács's proof provided a lower bound on the difference between these two quantities. He showed that, for infinitely many finite binary strings, this difference exceeds a version of the inverse Ackermann function applied to the string's length. This paper establishes that the lower bound can be substantially improved: the inverse Ackermann function can be replaced by a function of order log log x in the string length x. This shows that in continuous sample spaces, descriptional monotonic complexity and algorithmic probability are very different. While the proof builds on Gács's original work, it has a number of new features; in particular, the algorithm at the heart of the proof works on sets of strings rather than on individual strings.
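For readers who want the statements in symbols, the following is a minimal LaTeX sketch using standard Li–Vitányi notation (K for prefix-free complexity, m for the universal discrete semimeasure, Km for monotone complexity, M for the universal continuous semimeasure). The constant c and the exact shape of the improved bound are assumptions read off from the abstract, not statements taken from the paper's proofs.

% Coding theorem (discrete case): prefix-free complexity and the
% negative logarithm of the universal discrete semimeasure agree
% up to an additive constant.
\[
  K(x) = -\log m(x) + O(1).
\]
% Gács: the continuous analogue fails. The gap between monotone
% complexity Km and the negative logarithm of algorithmic
% probability M is unbounded; Gács bounded it below by an
% inverse-Ackermann-type function of the length |x|, for
% infinitely many strings x.
%
% This paper: the lower bound is improved to log-log order; as an
% assumed reading of the abstract, there is a constant c > 0 with
\[
  Km(x) + \log M(x) \;\ge\; c \,\log\log|x|
  \quad \text{for infinitely many } x \in \{0,1\}^{*}.
\]

Since log M(x) is negative, the first quantity above is exactly the gap Km(x) − (−log M(x)) discussed in the abstract.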