Increasing the Gap between Descriptional Complexity and Algorithmic Probability

A. Day
{"title":"Increasing the Gap between Descriptional Complexity and Algorithmic Probability","authors":"A. Day","doi":"10.1109/CCC.2009.13","DOIUrl":null,"url":null,"abstract":"The coding theorem is a fundamental result of algorithmic information theory. A well known theorem of Gács shows that the analog of the coding theorem fails for continuous sample spaces. This means that descriptional monotonic complexity does not coincide within an additive constant with the negative logarithm of algorithmic probability. Gács's proof provided a lower bound on the difference between these values. He showed that for infinitely many finite binary strings, this difference was greater than a version of the inverse Ackermann function applied to string length. This paper establishes that this lower bound can be substantially improved. The inverse Ackermann function can be replaced with a function O(log(log(x))). This shows that in continuous sample spaces, descriptional monotonic complexity and algorithmic probability are very different. While this proof builds on the original work by Gács, it does have a number of new features, in particular, the algorithm at the heart of the proof works on sets of strings as opposed to individual strings.","PeriodicalId":158572,"journal":{"name":"2009 24th Annual IEEE Conference on Computational Complexity","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 24th Annual IEEE Conference on Computational Complexity","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCC.2009.13","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

The coding theorem is a fundamental result of algorithmic information theory. A well-known theorem of Gács shows that the analog of the coding theorem fails for continuous sample spaces. This means that descriptional monotonic complexity does not coincide, within an additive constant, with the negative logarithm of algorithmic probability. Gács's proof provided a lower bound on the difference between these values. He showed that, for infinitely many finite binary strings, this difference exceeds a version of the inverse Ackermann function applied to string length. This paper establishes that this lower bound can be substantially improved: the inverse Ackermann function can be replaced with a function of order log(log(x)). This shows that in continuous sample spaces, descriptional monotonic complexity and algorithmic probability are very different. While this proof builds on the original work by Gács, it has a number of new features; in particular, the algorithm at the heart of the proof works on sets of strings rather than on individual strings.
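The abstract does not fix notation, so the following sketch uses the standard symbols of algorithmic information theory (K for prefix-free complexity, m for the discrete universal semimeasure, Km for monotone complexity, and M for the continuous universal semimeasure); the precise form of the bounds is in the paper itself.

```latex
% Discrete coding theorem (Levin): prefix-free complexity agrees with
% the negative logarithm of discrete algorithmic probability, up to an
% additive constant:
K(x) = -\log m(x) + O(1).

% Gács showed the continuous analogue fails: for infinitely many
% finite binary strings x,
Km(x) - \bigl(-\log M(x)\bigr) > g(|x|),

% where Gács's g(n) was an inverse-Ackermann-type function of the
% string length n. This paper improves the lower bound so that g(n)
% can be taken of order \log \log n.
```

These statements paraphrase the standard formulations; constants and the exact definition of the gap function are as given in the paper and in Gács's original work.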