Vincent Painchaud; Patrick Desrosiers; Nicolas Doyon
{"title":"随机神经元大型网络中协方差的决定性作用","authors":"Vincent Painchaud;Patrick Desrosiers;Nicolas Doyon","doi":"10.1162/neco_a_01656","DOIUrl":null,"url":null,"abstract":"Biological neural networks are notoriously hard to model due to their stochastic behavior and high dimensionality. We tackle this problem by constructing a dynamical model of both the expectations and covariances of the fractions of active and refractory neurons in the network’s populations. We do so by describing the evolution of the states of individual neurons with a continuous-time Markov chain, from which we formally derive a low-dimensional dynamical system. This is done by solving a moment closure problem in a way that is compatible with the nonlinearity and boundedness of the activation function. Our dynamical system captures the behavior of the high-dimensional stochastic model even in cases where the mean-field approximation fails to do so. Taking into account the second-order moments modifies the solutions that would be obtained with the mean-field approximation and can lead to the appearance or disappearance of fixed points and limit cycles. We moreover perform numerical experiments where the mean-field approximation leads to periodically oscillating solutions, while the solutions of the second-order model can be interpreted as an average taken over many realizations of the stochastic model. Altogether, our results highlight the importance of including higher moments when studying stochastic networks and deepen our understanding of correlated neuronal activity.","PeriodicalId":54731,"journal":{"name":"Neural Computation","volume":"36 6","pages":"1121-1162"},"PeriodicalIF":2.7000,"publicationDate":"2024-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Determining Role of Covariances in Large Networks of Stochastic Neurons\",\"authors\":\"Vincent Painchaud;Patrick Desrosiers;Nicolas Doyon\",\"doi\":\"10.1162/neco_a_01656\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Biological neural networks are notoriously hard to model due to their stochastic behavior and high dimensionality. We tackle this problem by constructing a dynamical model of both the expectations and covariances of the fractions of active and refractory neurons in the network’s populations. We do so by describing the evolution of the states of individual neurons with a continuous-time Markov chain, from which we formally derive a low-dimensional dynamical system. This is done by solving a moment closure problem in a way that is compatible with the nonlinearity and boundedness of the activation function. Our dynamical system captures the behavior of the high-dimensional stochastic model even in cases where the mean-field approximation fails to do so. Taking into account the second-order moments modifies the solutions that would be obtained with the mean-field approximation and can lead to the appearance or disappearance of fixed points and limit cycles. We moreover perform numerical experiments where the mean-field approximation leads to periodically oscillating solutions, while the solutions of the second-order model can be interpreted as an average taken over many realizations of the stochastic model. 
Altogether, our results highlight the importance of including higher moments when studying stochastic networks and deepen our understanding of correlated neuronal activity.\",\"PeriodicalId\":54731,\"journal\":{\"name\":\"Neural Computation\",\"volume\":\"36 6\",\"pages\":\"1121-1162\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-03-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Computation\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10661263/\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computation","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10661263/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
The Determining Role of Covariances in Large Networks of Stochastic Neurons
Biological neural networks are notoriously hard to model due to their stochastic behavior and high dimensionality. We tackle this problem by constructing a dynamical model of both the expectations and covariances of the fractions of active and refractory neurons in the network’s populations. We do so by describing the evolution of the states of individual neurons with a continuous-time Markov chain, from which we formally derive a low-dimensional dynamical system. This is done by solving a moment closure problem in a way that is compatible with the nonlinearity and boundedness of the activation function. Our dynamical system captures the behavior of the high-dimensional stochastic model even in cases where the mean-field approximation fails to do so. Taking into account the second-order moments modifies the solutions that would be obtained with the mean-field approximation and can lead to the appearance or disappearance of fixed points and limit cycles. We moreover perform numerical experiments where the mean-field approximation leads to periodically oscillating solutions, while the solutions of the second-order model can be interpreted as an average taken over many realizations of the stochastic model. Altogether, our results highlight the importance of including higher moments when studying stochastic networks and deepen our understanding of correlated neuronal activity.
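To make the setup concrete, here is a minimal sketch, assuming a single population in which each neuron is sensitive, active, or refractory and switches state as a continuous-time Markov chain. The sigmoid activation, the coupling weight w, and the rates alpha, beta, and gamma below are illustrative assumptions, not the paper's exact formulation. The sketch averages the fraction of active neurons over many realizations of the stochastic model and compares it with the corresponding first-order (mean-field) ODE, which ignores covariances.

```python
# Illustrative sketch only: the transition rates, sigmoid, and parameters are
# assumptions chosen for demonstration, not the model derived in the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 500                              # number of neurons in the population
w = 6.0                              # recurrent coupling weight (assumed)
alpha, beta, gamma = 1.0, 0.8, 0.5   # activation, deactivation, recovery rates (assumed)
dt, T = 0.01, 30.0                   # time step and horizon

def sigmoid(x):
    """Bounded, nonlinear activation function (illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-(x - 3.0)))

def simulate_once():
    """One realization of the stochastic model; returns the fraction of active neurons over time."""
    state = np.zeros(N, dtype=int)   # 0 = sensitive, 1 = active, 2 = refractory
    steps = int(T / dt)
    frac_active = np.empty(steps)
    for t in range(steps):
        a = np.mean(state == 1)
        frac_active[t] = a
        u = rng.random(N)
        # sensitive -> active at rate alpha * sigmoid(w * fraction active)
        fire = (state == 0) & (u < alpha * sigmoid(w * a) * dt)
        # active -> refractory at rate beta
        tire = (state == 1) & (u < beta * dt)
        # refractory -> sensitive at rate gamma
        wake = (state == 2) & (u < gamma * dt)
        state[fire], state[tire], state[wake] = 1, 2, 0
    return frac_active

def mean_field():
    """First-order (mean-field) approximation: covariances are ignored."""
    a, r = 0.0, 0.0
    steps = int(T / dt)
    out = np.empty(steps)
    for t in range(steps):
        out[t] = a
        s = 1.0 - a - r
        da = alpha * sigmoid(w * a) * s - beta * a
        dr = beta * a - gamma * r
        a, r = a + da * dt, r + dr * dt
    return out

# Average over many realizations and compare with the mean-field prediction.
avg = np.mean([simulate_once() for _ in range(50)], axis=0)
mf = mean_field()
print("final fraction active (stochastic average):", round(avg[-1], 3))
print("final fraction active (mean-field ODE):    ", round(mf[-1], 3))
```

In regimes where fluctuations matter, the averaged stochastic trace can drift away from the mean-field curve; the second-order model described in the abstract adds equations for the covariances precisely to capture such discrepancies.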
Journal introduction:
Neural Computation is uniquely positioned at the crossroads between neuroscience and TMCS and welcomes the submission of original papers from all areas of TMCS, including: Advanced experimental design; Analysis of chemical sensor data; Connectomic reconstructions; Analysis of multielectrode and optical recordings; Genetic data for cell identity; Analysis of behavioral data; Multiscale models; Analysis of molecular mechanisms; Neuroinformatics; Analysis of brain imaging data; Neuromorphic engineering; Principles of neural coding, computation, circuit dynamics, and plasticity; Theories of brain function.