Nonlinear Neural Dynamics and Classification Accuracy in Reservoir Computing
Claus Metzner, Achim Schilling, Andreas Maier, Patrick Krauss
Neural Computation, pp. 1469-1504. Published 2025-07-17. DOI: 10.1162/neco_a_01770
Citations: 0
Abstract
Reservoir computing, information processing based on untrained recurrent neural networks with random connections, is expected to depend on the nonlinear properties of the neurons and the resulting oscillatory, chaotic, or fixed-point dynamics of the network. However, the degree of nonlinearity required and the range of suitable dynamical regimes for a given task remain poorly understood. To clarify these issues, we study the classification accuracy of a reservoir computer in artificial tasks of varying complexity while tuning both the neuron's degree of nonlinearity and the reservoir's dynamical regime. We find that even with activation functions of extremely reduced nonlinearity, weak recurrent interactions, and small input signals, the reservoir can compute useful representations. These representations, detectable only in higher-order principal components, make complex classification tasks linearly separable for the readout layer. Increasing the recurrent coupling leads to spontaneous dynamical behavior. Nevertheless, some input-related computations can "ride on top" of oscillatory or fixed-point attractors with little loss of accuracy, whereas chaotic dynamics often reduces task performance. By tuning the system through the full range of dynamical phases, we observe in several classification tasks that accuracy peaks at both the oscillatory/chaotic and chaotic/fixed-point phase boundaries, supporting the edge of chaos hypothesis. We also present a regression task with the opposite behavior. Our findings, particularly the robust weakly nonlinear operating regime, may offer new perspectives for both technical and biological neural networks with random connectivity.
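To make the general setup described in the abstract concrete, the sketch below shows a minimal echo-state-style reservoir with a tunable nonlinearity strength and recurrent coupling, whose states are projected onto principal components and fed to a linear readout. This is an illustrative assumption-laden example, not the authors' actual model or tasks: the parameter names (alpha, rho), the toy two-class signal task, and the use of numpy/scikit-learn are choices made only for this sketch.

```python
# Minimal echo-state-style reservoir sketch (illustrative only; not the
# paper's actual model or experiments). Assumes numpy and scikit-learn.
# `alpha` (nonlinearity strength), `rho` (coupling strength), and the toy
# classification task are hypothetical choices for this example.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N = 200        # reservoir size
T = 50         # time steps per input sequence
alpha = 0.1    # degree of nonlinearity (alpha -> 0 approaches a linear unit)
rho = 0.5      # recurrent coupling strength (spectral radius)

# Random, untrained recurrent weights, rescaled to the desired spectral radius.
W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=N)  # random input weights

def activation(x, alpha):
    """Tunable nonlinearity: tanh(alpha*x)/alpha is nearly linear for small alpha."""
    return np.tanh(alpha * x) / alpha

def run_reservoir(u):
    """Drive the reservoir with a scalar input sequence u; return the final state."""
    x = np.zeros(N)
    for u_t in u:
        x = activation(W @ x + W_in * u_t, alpha)
    return x

# Toy two-class task: noisy sine waves of two different frequencies.
def make_sample(label):
    freq = 0.2 if label == 0 else 0.3
    t = np.arange(T)
    return np.sin(freq * t) + 0.1 * rng.normal(size=T)

labels = rng.integers(0, 2, size=400)
states = np.array([run_reservoir(make_sample(y)) for y in labels])

# Project reservoir states onto principal components and train a linear readout.
Z = PCA(n_components=20).fit_transform(states)
Z_tr, Z_te, y_tr, y_te = train_test_split(Z, labels, test_size=0.25, random_state=0)
readout = LogisticRegression(max_iter=1000).fit(Z_tr, y_tr)
print("test accuracy:", readout.score(Z_te, y_te))
```

Sweeping alpha and rho in a loop over such a script is one plausible way to probe how nonlinearity and dynamical regime affect readout accuracy, in the spirit of the study, though the paper's concrete tasks and tuning procedure differ.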
Journal Introduction:
Neural Computation is uniquely positioned at the crossroads between neuroscience and TMCS and welcomes the submission of original papers from all areas of TMCS, including: Advanced experimental design; Analysis of chemical sensor data; Connectomic reconstructions; Analysis of multielectrode and optical recordings; Genetic data for cell identity; Analysis of behavioral data; Multiscale models; Analysis of molecular mechanisms; Neuroinformatics; Analysis of brain imaging data; Neuromorphic engineering; Principles of neural coding, computation, circuit dynamics, and plasticity; Theories of brain function.