Refined Kolmogorov complexity of analog, evolving and stochastic recurrent neural networks

Jérémie Cabessa, Yann Strozecki

Information Sciences, vol. 711, Article 122104. DOI: 10.1016/j.ins.2025.122104. Published: 2025-03-19.
Citations: 0
Abstract
Kolmogorov complexity measures the compressibility of real numbers. We provide a refined characterization of the hypercomputational power of analog, evolving, and stochastic neural networks based on the Kolmogorov complexity of their real weights, evolving weights, and real probabilities, respectively. First, we retrieve the infinite hierarchy of complexity classes of analog networks, defined in terms of the Kolmogorov complexity of their real weights. This hierarchy lies between the complexity classes P and P/poly. Next, using a natural identification between real numbers and infinite sequences of bits, we generalize this result to evolving networks, obtaining a similar hierarchy of complexity classes within the same bounds. Finally, we extend these results to stochastic networks that employ real probabilities as randomness, deriving a new infinite hierarchy of complexity classes situated between BPP and BPP/log*. Beyond providing examples of such hierarchies, we describe a generic method for constructing them based on classes of functions of increasing complexity. As a practical application, we show that the predictive capabilities of recurrent neural networks are strongly impacted by the quantization applied to their weights. Overall, these results highlight the relationship between the computational power of neural networks and the intrinsic information contained in their parameters.
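The abstract's practical claim, that the behavior of a recurrent network depends on the precision of its real weights, can be illustrated with a minimal pure-Python sketch. This is not the paper's construction: the tiny tanh RNN, the random weights, and the truncation-to-b-fractional-bits quantizer below are all illustrative assumptions, chosen only to show that coarser weight quantization drives the final hidden state further from the full-precision reference.

```python
import math
import random

def quantize(w, bits):
    # Illustrative quantizer (an assumption, not the paper's scheme):
    # keep only `bits` fractional bits of the real weight.
    scale = 2 ** bits
    return math.floor(w * scale) / scale

def rnn_step(h, x, W, U, b):
    # One step of a simple tanh recurrent cell.
    return [math.tanh(sum(W[i][j] * h[j] for j in range(len(h)))
                      + U[i] * x + b[i]) for i in range(len(W))]

def run(W, U, b, xs):
    # Run the RNN over the input sequence from a zero hidden state.
    h = [0.0] * len(W)
    for x in xs:
        h = rnn_step(h, x, W, U, b)
    return h

random.seed(0)
n = 4  # hidden units (arbitrary small size)
W = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
U = [random.uniform(-1, 1) for _ in range(n)]
b = [random.uniform(-1, 1) for _ in range(n)]
xs = [random.uniform(-1, 1) for _ in range(20)]

ref = run(W, U, b, xs)  # full-precision reference trajectory

errs = {}
for bits in (16, 8, 4, 2):
    Wq = [[quantize(w, bits) for w in row] for row in W]
    Uq = [quantize(u, bits) for u in U]
    bq = [quantize(v, bits) for v in b]
    out = run(Wq, Uq, bq, xs)
    # Max deviation of the final hidden state from the reference.
    errs[bits] = max(abs(a - c) for a, c in zip(ref, out))
    print(f"{bits:2d} fractional bits: max deviation {errs[bits]:.6f}")
```

Running this shows the deviation from the full-precision network growing as fewer bits of each weight are retained, a small-scale analogue of the compressibility-versus-power relationship the paper formalizes via Kolmogorov complexity.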
About the journal:
Information Sciences (Informatics and Computer Science, Intelligent Systems, Applications) is an international journal that publishes original and creative research findings in the field of information sciences, together with a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.