Tipping prediction of a class of large-scale radial-ring neural networks

Impact Factor: 6.0 · CAS Region 1 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
Yunxiang Lu, Min Xiao, Xiaoqun Wu, Hamid Reza Karimi, Xiangpeng Xie, Jinde Cao, Wei Xing Zheng
{"title":"Tipping prediction of a class of large-scale radial-ring neural networks","authors":"Yunxiang Lu ,&nbsp;Min Xiao ,&nbsp;Xiaoqun Wu ,&nbsp;Hamid Reza Karimi ,&nbsp;Xiangpeng Xie ,&nbsp;Jinde Cao ,&nbsp;Wei Xing Zheng","doi":"10.1016/j.neunet.2024.106820","DOIUrl":null,"url":null,"abstract":"<div><div>Understanding the emergence and evolution of collective dynamics in large-scale neural networks remains a complex challenge. This paper seeks to address this gap by applying dynamical systems theory, with a particular focus on tipping mechanisms. First, we introduce a novel <span><math><mrow><mo>(</mo><mi>n</mi><mo>+</mo><mi>m</mi><mi>n</mi><mo>)</mo></mrow></math></span>-scale radial-ring neural network and employ Coates’ flow graph topological approach to derive the characteristic equation of the linearized network. Second, through deriving stability conditions and predicting the tipping point using an algebraic approach based on the integral element concept, we identify critical factors such as the synaptic transmission delay, the self-feedback coefficient, and the network topology. Finally, we validate the methodology’s effectiveness in predicting the tipping point. The findings reveal that increased synaptic transmission delay can induce and amplify periodic oscillations. Additionally, the self-feedback coefficient and the network topology influence the onset of tipping points. Moreover, the selection of activation function impacts both the number of equilibrium solutions and the convergence speed of the neural network. Lastly, we demonstrate that the proposed large-scale radial-ring neural network exhibits stronger robustness compared to lower-scale networks with a single topology. The results provide a comprehensive depiction of the dynamics observed in large-scale neural networks under the influence of various factor combinations.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"181 ","pages":"Article 106820"},"PeriodicalIF":6.0000,"publicationDate":"2024-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024007445","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Understanding the emergence and evolution of collective dynamics in large-scale neural networks remains a complex challenge. This paper seeks to address this gap by applying dynamical systems theory, with a particular focus on tipping mechanisms. First, we introduce a novel (n+mn)-scale radial-ring neural network and employ Coates’ flow graph topological approach to derive the characteristic equation of the linearized network. Second, through deriving stability conditions and predicting the tipping point using an algebraic approach based on the integral element concept, we identify critical factors such as the synaptic transmission delay, the self-feedback coefficient, and the network topology. Finally, we validate the methodology’s effectiveness in predicting the tipping point. The findings reveal that increased synaptic transmission delay can induce and amplify periodic oscillations. Additionally, the self-feedback coefficient and the network topology influence the onset of tipping points. Moreover, the selection of the activation function impacts both the number of equilibrium solutions and the convergence speed of the neural network. Lastly, we demonstrate that the proposed large-scale radial-ring neural network exhibits stronger robustness than lower-scale networks with a single topology. The results provide a comprehensive depiction of the dynamics observed in large-scale neural networks under the influence of various factor combinations.
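The abstract does not give the model equations, but the delay-induced tipping it describes can be illustrated with a generic example: a small delayed ring of Hopfield-style neurons in which increasing the synaptic transmission delay turns a decaying transient into sustained periodic oscillations. The sketch below is a minimal illustration under that assumption, not the paper’s (n+mn)-scale radial-ring model; the function name simulate_ring, the coupling values a and c, and the forward-Euler delay integration are all illustrative choices.

```python
# Minimal sketch (NOT the paper's model): a delayed ring of Hopfield-style
# neurons, x_i'(t) = -x_i(t) + a*tanh(x_i(t - tau)) + c*tanh(x_{i-1}(t - tau)),
# integrated with forward Euler and a history buffer for the delay term.
# All parameter values are illustrative assumptions.
import numpy as np

def simulate_ring(n=8, a=-1.2, c=1.6, tau=1.0, dt=0.01, t_end=200.0):
    """Simulate the delayed ring and return the state trajectory."""
    steps = int(t_end / dt)
    lag = int(round(tau / dt))
    rng = np.random.default_rng(0)
    x0 = 0.05 * rng.standard_normal(n)          # small perturbation of the origin
    hist = np.tile(x0, (lag + 1, 1))             # constant history on [-tau, 0]
    traj = np.empty((steps, n))
    x = x0.copy()
    for k in range(steps):
        x_delayed = hist[0]                       # state at time t - tau
        drive = a * np.tanh(x_delayed) + c * np.tanh(np.roll(x_delayed, 1))
        x = x + dt * (-x + drive)                 # forward Euler step
        hist = np.vstack([hist[1:], x])           # slide the delay window
        traj[k] = x
    return traj

# Oscillation amplitude over the last quarter of the run: near zero while the
# equilibrium is still stable, clearly positive past the delay-induced tipping point.
for tau in (0.2, 0.6, 1.2):
    tail = simulate_ring(tau=tau)[-5000:]
    print(f"tau = {tau:.1f}  ->  amplitude ~ {tail.std():.3f}")
```

With these illustrative coupling values, linearizing around the origin suggests a delay-induced Hopf bifurcation at roughly tau ≈ 0.7, so the printed amplitude stays near zero for the two smaller delays and becomes clearly positive for tau = 1.2, mirroring the qualitative effect of the synaptic transmission delay described in the abstract.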
Source journal: Neural Networks (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles published per year: 425
Average review time: 67 days
期刊介绍: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.