Role of short-term plasticity and slow temporal dynamics in enhancing time series prediction with a brain-inspired recurrent neural network

IF 2.7 · CAS Tier 2 (Mathematics) · JCR Q1 (Mathematics, Applied)
Chaos · Pub Date: 2025-02-01 · DOI: 10.1063/5.0233158
Artem Pilzak, Matias Calderini, Nareg Berberian, Jean-Philippe Thivierge
{"title":"短期可塑性和慢时间动态在脑启发递归神经网络增强时间序列预测中的作用。","authors":"Artem Pilzak, Matias Calderini, Nareg Berberian, Jean-Philippe Thivierge","doi":"10.1063/5.0233158","DOIUrl":null,"url":null,"abstract":"<p><p>Typical reservoir networks are based on random connectivity patterns that differ from brain circuits in two important ways. First, traditional reservoir networks lack synaptic plasticity among recurrent units, whereas cortical networks exhibit plasticity across all neuronal types and cortical layers. Second, reservoir networks utilize random Gaussian connectivity, while cortical networks feature a heavy-tailed distribution of synaptic strengths. It is unclear what are the computational advantages of these features for predicting complex time series. In this study, we integrated short-term plasticity (STP) and lognormal connectivity into a novel recurrent neural network (RNN) framework. The model exhibited rich patterns of population activity characterized by slow coordinated fluctuations. Using graph spectral decomposition, we show that weighted networks with lognormal connectivity and STP yield higher complexity than several graph types. When tested on various tasks involving the prediction of complex time series data, the RNN model outperformed a baseline model with random connectivity as well as several other network architectures. Overall, our results underscore the potential of incorporating brain-inspired features such as STP and heavy-tailed connectivity to enhance the robustness and performance of artificial neural networks in complex data prediction and signal processing tasks.</p>","PeriodicalId":9974,"journal":{"name":"Chaos","volume":"35 2","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Role of short-term plasticity and slow temporal dynamics in enhancing time series prediction with a brain-inspired recurrent neural network.\",\"authors\":\"Artem Pilzak, Matias Calderini, Nareg Berberian, Jean-Philippe Thivierge\",\"doi\":\"10.1063/5.0233158\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Typical reservoir networks are based on random connectivity patterns that differ from brain circuits in two important ways. First, traditional reservoir networks lack synaptic plasticity among recurrent units, whereas cortical networks exhibit plasticity across all neuronal types and cortical layers. Second, reservoir networks utilize random Gaussian connectivity, while cortical networks feature a heavy-tailed distribution of synaptic strengths. It is unclear what are the computational advantages of these features for predicting complex time series. In this study, we integrated short-term plasticity (STP) and lognormal connectivity into a novel recurrent neural network (RNN) framework. The model exhibited rich patterns of population activity characterized by slow coordinated fluctuations. Using graph spectral decomposition, we show that weighted networks with lognormal connectivity and STP yield higher complexity than several graph types. When tested on various tasks involving the prediction of complex time series data, the RNN model outperformed a baseline model with random connectivity as well as several other network architectures. 
Overall, our results underscore the potential of incorporating brain-inspired features such as STP and heavy-tailed connectivity to enhance the robustness and performance of artificial neural networks in complex data prediction and signal processing tasks.</p>\",\"PeriodicalId\":9974,\"journal\":{\"name\":\"Chaos\",\"volume\":\"35 2\",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2025-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chaos\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1063/5.0233158\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chaos","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1063/5.0233158","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0

Abstract


Typical reservoir networks are based on random connectivity patterns that differ from brain circuits in two important ways. First, traditional reservoir networks lack synaptic plasticity among recurrent units, whereas cortical networks exhibit plasticity across all neuronal types and cortical layers. Second, reservoir networks utilize random Gaussian connectivity, while cortical networks feature a heavy-tailed distribution of synaptic strengths. It is unclear what computational advantages these features offer for predicting complex time series. In this study, we integrated short-term plasticity (STP) and lognormal connectivity into a novel recurrent neural network (RNN) framework. The model exhibited rich patterns of population activity characterized by slow coordinated fluctuations. Using graph spectral decomposition, we show that weighted networks with lognormal connectivity and STP yield higher complexity than several graph types. When tested on various tasks involving the prediction of complex time series data, the RNN model outperformed a baseline model with random connectivity as well as several other network architectures. Overall, our results underscore the potential of incorporating brain-inspired features such as STP and heavy-tailed connectivity to enhance the robustness and performance of artificial neural networks in complex data prediction and signal processing tasks.
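
To make the two brain-inspired ingredients above concrete, the following Python sketch combines a leaky-rate reservoir having sparse, lognormally distributed recurrent weights with a Tsodyks-Markram-style short-term plasticity rule, and trains a ridge-regression readout for one-step-ahead prediction of a toy signal. This is a minimal illustration, not the authors' implementation: the specific STP equations, all parameter values, and the toy task are assumptions chosen for demonstration only.

```python
# Illustrative sketch (assumptions throughout; not the model from the paper):
# leaky-rate reservoir + lognormal recurrent weights + Tsodyks-Markram-style STP,
# with a ridge-regression readout for one-step-ahead prediction.
import numpy as np

rng = np.random.default_rng(0)

N = 300                        # number of recurrent units (assumed)
dt = 1.0                       # integration step
tau = 10.0                     # rate time constant
tau_d, tau_f = 200.0, 600.0    # STP depression/facilitation time constants (assumed)
U = 0.2                        # baseline release probability (assumed)

# Heavy-tailed (lognormal) recurrent weights, sparse and sign-mixed,
# rescaled so the spectral radius sits just below 1 for stable dynamics.
W = rng.lognormal(mean=-1.0, sigma=1.0, size=(N, N))
W *= rng.random((N, N)) < 0.1                 # ~10% connection probability
W *= rng.choice([-1.0, 1.0], size=(N, N))     # mix excitation and inhibition
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))

w_in = rng.normal(0.0, 0.5, size=N)           # input weights

def run_reservoir(u_seq):
    """Drive the reservoir with input u_seq and return the state history."""
    r = np.zeros(N)              # unit activities
    x = np.ones(N)               # available synaptic resources (depression)
    u_stp = np.full(N, U)        # utilization (facilitation)
    states = []
    for u_t in u_seq:
        rate = np.maximum(r, 0.0)            # STP acts on nonnegative rates
        eff = u_stp * x * rate               # STP-modulated presynaptic drive
        drive = W @ eff + w_in * u_t
        r = r + (dt / tau) * (-r + np.tanh(drive))
        # Tsodyks-Markram-style updates of the STP variables.
        x = x + dt * ((1.0 - x) / tau_d - u_stp * x * rate)
        u_stp = u_stp + dt * ((U - u_stp) / tau_f + U * (1.0 - u_stp) * rate)
        x = np.clip(x, 0.0, 1.0)
        u_stp = np.clip(u_stp, 0.0, 1.0)
        states.append(r.copy())
    return np.array(states)

# Toy task: one-step-ahead forecasting of a noisy sine wave.
T = 2000
t = np.arange(T)
signal = np.sin(0.05 * t) + 0.05 * rng.normal(size=T)
states = run_reservoir(signal[:-1])
targets = signal[1:]

# Ridge-regression readout; the washout discards the initial transient.
washout = 200
X, y = states[washout:], targets[washout:]
lam = 1e-4
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = X @ W_out
print("train NRMSE:", np.sqrt(np.mean((pred - y) ** 2)) / np.std(y))
```

Replacing the lognormal draw with a Gaussian of matched variance and freezing the STP variables at their baseline values yields a conventional random reservoir, which is roughly the kind of baseline the study compares against.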

Source journal
Chaos (Physics: Mathematical Physics)
CiteScore: 5.20
Self-citation rate: 13.80%
Articles per year: 448
Review time: 2.3 months
Journal description: Chaos: An Interdisciplinary Journal of Nonlinear Science is a peer-reviewed journal devoted to increasing the understanding of nonlinear phenomena and describing the manifestations in a manner comprehensible to researchers from a broad spectrum of disciplines.