{"title":"Information theory for complex systems scientists: What, why, and how","authors":"Thomas F. Varley","doi":"10.1016/j.physrep.2025.09.007","DOIUrl":null,"url":null,"abstract":"<div><div>In the 21st century, many of the crucial scientific and technical issues facing humanity can be understood as problems associated with understanding, modeling, and ultimately controlling <em>complex systems</em>: systems composed of a large number of non-trivially interacting components whose collective behavior can be difficult to predict. Information theory, a branch of mathematics historically associated with questions about encoding and decoding messages, has emerged as something of a <em>lingua franca</em> for those studying complex systems, far exceeding its original narrow domain of communication systems engineering. In the context of complexity science, information theory provides a set of tools which allow researchers to describe a variety of dependencies, including interactions between the component parts of a system, interactions between a system and its environment, and the mereological interaction between the parts and the “whole”.</div><div>This review aims to provide an accessible introduction to the core of modern information theory, written specifically for aspiring (and established) complex systems scientists. It covers standard measures, such as Shannon entropy, relative entropy, and mutual information, before building to more advanced topics, including: information dynamics, measures of statistical complexity, information decomposition, and effective network inference. In addition to detailing the formal definitions, we also make an effort to discuss how information theory can be <em>interpreted</em> and to develop the intuitions behind abstract concepts like “entropy”.
The goal is to enable interested readers to understand what information is and how it can be used to further their own research and education.</div></div>","PeriodicalId":404,"journal":{"name":"Physics Reports","volume":"1148 ","pages":"Pages 1-55"},"PeriodicalIF":29.5000,"publicationDate":"2025-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Physics Reports","FirstCategoryId":"4","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S037015732500256X","RegionNum":1,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PHYSICS, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
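The abstract names Shannon entropy and mutual information as the standard measures the review begins with. As a rough illustration of what these quantities mean in practice, here is a minimal Python sketch (a plug-in estimator from samples, not code from the paper itself) computing entropy in bits and mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math
from collections import Counter

def shannon_entropy(xs):
    """Plug-in estimate of H(X) = -sum_x p(x) log2 p(x), in bits."""
    n = len(xs)
    counts = Counter(xs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    joint = list(zip(xs, ys))
    return shannon_entropy(xs) + shannon_entropy(ys) - shannon_entropy(joint)

# A fair coin carries one bit of entropy; a variable paired with itself
# shares all of that bit, while these two samples share none of it.
coin = [0, 1, 0, 1]
print(shannon_entropy(coin))                   # 1.0 bit
print(mutual_information(coin, coin))          # 1.0 bit
print(mutual_information(coin, [0, 0, 1, 1]))  # 0.0 bits
```

Note that this naive estimator is biased for small samples; the intuition it conveys (entropy as uncertainty, mutual information as shared uncertainty) is the kind the review aims to develop.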
About the journal:
Physics Reports keeps the active physicist up-to-date on developments in a wide range of topics by publishing timely reviews that are more extensive than literature surveys but normally shorter than a full monograph. Each report deals with one specific subject and is generally published in a separate volume. These reviews are specialist in nature but contain enough introductory material to make the main points intelligible to a non-specialist. The reader will not only be able to distinguish important developments and trends in physics but will also find a sufficient number of references to the original literature.