Transient Chaotic Dimensionality Expansion by Recurrent Networks
Christian Keup, Tobias Kühn, David Dahmen, Moritz Helias
Phys. Rev. X 11, 021064 (2021). DOI: 10.1103/PhysRevX.11.021064
Preprint posted 2020-02-25 (arXiv, Disordered Systems and Neural Networks).
Citations: 14
Abstract
Cortical neurons communicate with spikes, which are discrete events in time. Even if the timings of the individual events are strongly chaotic (microscopic chaos), the rate of events may still be non-chaotic or at the edge of what is known as rate chaos. Such edge-of-chaos dynamics are beneficial to the computational power of neuronal networks. We analyze both types of chaotic dynamics in densely connected networks of asynchronous binary neurons by developing and applying a model-independent field theory for neuronal networks. We find a strongly size-dependent transition to microscopic chaos. We then expose the conceptual difficulty at the heart of the definition of rate chaos, identify two reasonable definitions, and show that under neither of them do the binary network dynamics cross a transition to rate chaos.
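To make the notion of microscopic chaos concrete, the sketch below (our illustration, not the authors' code) implements the classic damage-spreading probe in a densely connected network of stochastic binary neurons: two copies of the same network share couplings, update order, and random numbers, and differ only in a single initial neuron state. A growing Hamming distance between the copies signals chaotic microscopic dynamics. The network size N, coupling gain g, and inverse temperature beta are illustrative choices.

```python
# Minimal sketch of damage spreading in an asynchronous binary network.
# Not the paper's code; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

N = 500                                   # network size (illustrative)
g = 2.0                                   # coupling gain; large g favors chaos
beta = 4.0                                # inverse temperature of the updates
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # dense Gaussian couplings
np.fill_diagonal(J, 0.0)                  # no self-coupling

def glauber_update(s, i, u):
    """Set neuron i to +1 with probability sigmoid(2*beta*local field)."""
    h = np.clip(2.0 * beta * (J[i] @ s), -30.0, 30.0)  # clip to avoid overflow
    s[i] = 1.0 if u < 1.0 / (1.0 + np.exp(-h)) else -1.0

s_a = rng.choice([-1.0, 1.0], size=N)
s_b = s_a.copy()
s_b[0] *= -1.0                            # microscopic perturbation: flip one neuron

for sweep in range(20):
    order = rng.integers(0, N, size=N)    # shared asynchronous update order
    noise = rng.random(N)                 # shared randomness for both copies
    for i, u in zip(order, noise):
        glauber_update(s_a, i, u)
        glauber_update(s_b, i, u)
    print(f"sweep {sweep:2d}: Hamming distance = {np.mean(s_a != s_b):.3f}")
```

Because both copies see identical noise and update order, any growth of the Hamming distance is attributable to the network dynamics itself rather than to differing random choices.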
The analysis of diverging trajectories in chaotic networks also allows us to study the classification of linearly non-separable classes of stimuli in a reservoir computing approach. We show that microscopic chaos rapidly expands the dimensionality of the representation while, crucially, the number of dimensions corrupted by noise lags behind. This translates into a transient peak in the networks' classification performance even deep in the chaotic regime, challenging the view that computational performance is always optimal near the edge of chaos. The effect is generic to high-dimensional chaotic systems rather than specific to binary networks: we also demonstrate it in a continuous 'rate' network, a spiking leaky integrate-and-fire (LIF) network, and an LSTM network. For binary and LIF networks, classification performance peaks rapidly, within one activation per participating neuron, demonstrating fast event-based computation that may be exploited by biological neural systems, for which we propose testable predictions.
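The transient peak can be illustrated with a small reservoir computing experiment. The sketch below is our own illustration under simplifying assumptions: it uses a continuous tanh 'rate' network (one of the model classes the paper also tests) rather than binary or LIF units, embeds a linearly non-separable XOR task as noisy initial conditions, and trains a ridge-regression readout on the network state at each time step. Test accuracy typically starts near chance, rises quickly as the chaotic dynamics expand the representation, and later decays as noise-corrupted dimensions catch up. All parameters are illustrative.

```python
# Minimal sketch of transient dimensionality expansion in a chaotic
# tanh rate reservoir. Illustrative stand-in, not the paper's setup.
import numpy as np

rng = np.random.default_rng(1)
N, g, T, sigma = 300, 2.0, 15, 0.1
trials = 200

J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # chaotic regime for g > 1
E = rng.standard_normal((N, 2))                    # random input embedding

# XOR labels: no linear readout of the 2-d inputs can solve this task
bits = rng.integers(0, 2, size=(trials, 2))
labels = bits[:, 0] ^ bits[:, 1]
inputs = (2.0 * bits - 1.0) + sigma * rng.standard_normal((trials, 2))

X = np.tanh(inputs @ E.T)                          # initial reservoir states
train, test = slice(0, trials // 2), slice(trials // 2, trials)

for t in range(T):
    # train a ridge-regression readout on this time step's states
    A = np.c_[X[train], np.ones(trials // 2)]
    w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(N + 1),
                        A.T @ (2.0 * labels[train] - 1.0))
    pred = np.c_[X[test], np.ones(trials - trials // 2)] @ w > 0
    acc = np.mean(pred == labels[test].astype(bool))
    print(f"t = {t:2d}: test accuracy = {acc:.2f}")
    X = np.tanh(X @ J.T)                           # one step of autonomous dynamics
```

The readout is retrained at every time step, so the accuracy curve directly tracks how linearly decodable the class information is in the evolving representation, which is the quantity behind the transient performance peak described above.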