Title: Generalization of neural network models for complex network dynamics
Authors: Vaiva Vasiliauskaite, Nino Antulov-Fantulin
Journal: Communications Physics, pages 1–10 (Journal Article)
Published: 2024-10-25
DOI: 10.1038/s42005-024-01837-w
Journal metrics: Impact Factor 5.4, JCR Q1 (Physics, Multidisciplinary)
Open-access PDF: https://www.nature.com/articles/s42005-024-01837-w.pdf
Article page: https://www.nature.com/articles/s42005-024-01837-w
Citations: 0
Abstract
Differential equations are a ubiquitous tool to study dynamics, ranging from physical systems to complex systems in which a large number of agents interact through a graph. Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of a dynamical system, especially in complex systems that lack explicit first principles. Neural networks are a recently adopted machine learning tool for studying dynamics and can be used either to find solutions of differential equations or to discover the equations themselves. However, deploying deep learning models in unfamiliar settings, such as predicting dynamics in unobserved regions of state space or on novel graphs, can lead to spurious results. Focusing on complex systems whose dynamics are described by a system of first-order differential equations coupled through a graph, we study the generalization of neural network predictions in settings where the statistical properties of the test data differ from those of the training data. We find that neural networks can accurately predict dynamics beyond the immediate training setting, provided the inputs remain within the domain of the training data. To identify when a model is unable to generalize to novel settings, we propose a statistical significance test. Deep learning is thus a promising alternative to traditional methods for discovering governing equations, such as variational and perturbation methods, or to data-driven approaches like symbolic regression. This paper explores the generalization of neural approximations of dynamics on complex networks to novel, unobserved settings and proposes a statistical testing framework to quantify confidence in the inferred predictions.
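To make the setting concrete, the class of systems studied here takes the form dx_i/dt = f(x_i) + Σ_j A_ij g(x_i, x_j), where A is the graph's adjacency matrix. The sketch below is a minimal illustration of a neural surrogate for such graph-coupled dynamics, not the paper's actual architecture or training procedure: the self-interaction f and the pairwise coupling g are replaced by tiny randomly initialized MLPs (names `mlp`, `neural_rhs`, `rollout` are all hypothetical), and the learned vector field is integrated with forward Euler.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, z):
    """Two-layer tanh MLP mapping each row of z to a scalar."""
    W1, b1, W2, b2 = params
    return np.tanh(z @ W1 + b1) @ W2 + b2

def init_mlp(in_dim, hidden=8):
    """Random (untrained) parameters; a real model would fit these to data."""
    return (rng.normal(0, 0.5, (in_dim, hidden)), np.zeros(hidden),
            rng.normal(0, 0.5, (hidden, 1)), np.zeros(1))

def neural_rhs(x, A, f_params, g_params):
    """Surrogate dx/dt = f_theta(x_i) + sum_j A_ij * g_phi(x_i, x_j)."""
    n = len(x)
    self_term = mlp(f_params, x[:, None]).ravel()              # f_theta(x_i)
    pairs = np.stack(np.meshgrid(x, x, indexing="ij"), -1).reshape(n * n, 2)
    coupling = mlp(g_params, pairs).reshape(n, n)              # g_phi(x_i, x_j)
    return self_term + (A * coupling).sum(axis=1)              # graph-weighted sum

def rollout(x0, A, f_params, g_params, dt=0.01, steps=100):
    """Forward-Euler integration of the learned vector field."""
    traj = [x0]
    for _ in range(steps):
        traj.append(traj[-1] + dt * neural_rhs(traj[-1], A, f_params, g_params))
    return np.array(traj)

# Toy example: scalar node states on a 5-node ring graph.
n = 5
A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
x0 = rng.normal(size=n)
traj = rollout(x0, A, init_mlp(1), init_mlp(2))
print(traj.shape)  # (101, 5): initial state plus 100 Euler steps
```

Because the surrogate is only trained (here, only evaluated) on states drawn from some region of state space, rollouts that drift outside that region, or runs on graphs with very different structure, are exactly the out-of-distribution settings whose reliability the paper's significance test is designed to flag.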
About the journal
Communications Physics is an open access journal from Nature Research publishing high-quality research, reviews and commentary in all areas of the physical sciences. Research papers published by the journal represent significant advances bringing new insight to a specialized area of research in physics. We also aim to provide a community forum for issues of importance to all physicists, regardless of sub-discipline.
The scope of the journal covers all areas of experimental, applied, fundamental, and interdisciplinary physical sciences. Primary research published in Communications Physics includes novel experimental results and new techniques or computational methods that may influence the work of others in the sub-discipline. We also consider submissions from adjacent research fields where the central advance of the study is of interest to physicists, for example, materials science, physical chemistry, and technology.