Neural Networks and Equilibria, Synchronization, and Time Lags

D. Danciu, V. Răsvan
{"title":"Neural Networks and Equilibria, Synchronization, and Time Lags","authors":"D. Danciu, V. Răsvan","doi":"10.4018/978-1-59904-849-9.CH178","DOIUrl":null,"url":null,"abstract":"All neural networks, both natural and artificial, are characterized by two kinds of dynamics. The first one is concerned with what we would call “learning dynamics”, in fact the sequential (discrete time) dynamics of the choice of synaptic weights. The second one is the intrinsic dynamics of the neural network viewed as a dynamical system after the weights have been established via learning. Regarding the second dynamics, the emergent computational capabilities of a recurrent neural network can be achieved provided it has many equilibria. The network task is achieved provided it approaches these equilibria. But the dynamical system has a dynamics induced a posteriori by the learning process that had established the synaptic weights. It is not compulsory that this a posteriori dynamics should have the required properties, hence they have to be checked separately. The standard stability properties (Lyapunov, asymptotic and exponential stability) are defined for a single equilibrium. Their counterpart for several equilibria are: mutability, global asymptotics, gradient behavior. For the definitions of these general concepts the reader is sent to Gelig et. al., (1978), Leonov et. al., (1992). In the last decades, the number of recurrent neural networks’ applications increased, they being designed for classification, identification and complex image, visual and spatio-temporal processing in fields as engineering, chemistry, biology and medicine (see, for instance: Fortuna et. al., 2001; Fink, 2004; Atencia et. al., 2004; Iwahori et. al., 2005; Maurer et. al., 2005; Guirguis & Ghoneimy, 2007). All these applications are mainly based on the existence of several equilibria for such networks, requiring them the “good behavior” properties above discussed. Another aspect of the qualitative analysis is the so-called synchronization problem, when an external stimulus, in most cases periodic or almost periodic has to be tracked (Gelig, 1982; Danciu, 2002). This problem is, from the mathematical point of view, nothing more but existence, uniqueness and global stability of forced oscillations. In the last decades the neural networks dynamics models have been modified once more by introducing the transmission delays. The standard model of a Hopfield-type network with delay as considered in (Gopalsamy & He, 1994) is","PeriodicalId":320314,"journal":{"name":"Encyclopedia of Artificial Intelligence","volume":"22 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Encyclopedia of Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/978-1-59904-849-9.CH178","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

All neural networks, both natural and artificial, are characterized by two kinds of dynamics. The first is what we would call "learning dynamics": the sequential (discrete-time) dynamics of the choice of the synaptic weights. The second is the intrinsic dynamics of the neural network viewed as a dynamical system once the weights have been established through learning. Concerning the second kind, the emergent computational capabilities of a recurrent neural network can be achieved provided the network has many equilibria, and the network accomplishes its task by approaching one of them. This dynamics, however, is induced a posteriori by the learning process that established the synaptic weights, and nothing guarantees that it has the required properties; they must therefore be checked separately. The standard stability properties (Lyapunov, asymptotic and exponential stability) are defined for a single equilibrium; their counterparts for several equilibria are mutability, global asymptotics and gradient behavior. For the definitions of these general concepts the reader is referred to Gelig et al. (1978) and Leonov et al. (1992). Over the last decades the applications of recurrent neural networks have multiplied, the networks being designed for classification, identification, and complex image, visual and spatio-temporal processing in fields such as engineering, chemistry, biology and medicine (see, for instance, Fortuna et al., 2001; Fink, 2004; Atencia et al., 2004; Iwahori et al., 2005; Maurer et al., 2005; Guirguis & Ghoneimy, 2007). All these applications rely mainly on the existence of several equilibria for such networks and therefore require the "good behavior" properties discussed above. Another aspect of the qualitative analysis is the so-called synchronization problem, in which an external stimulus, in most cases periodic or almost periodic, has to be tracked (Gelig, 1982; Danciu, 2002). From the mathematical point of view this problem is nothing other than the existence, uniqueness and global stability of forced oscillations. In the last decades the dynamical models of neural networks have been modified once more by the introduction of transmission delays. The standard model of a Hopfield-type network with delay, as considered in (Gopalsamy & He, 1994), is
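For orientation only, a Hopfield-type network with transmission delays is commonly written, up to notation, in the form

\dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{n} w_{ij}\, g_j\bigl(x_j(t-\tau_{ij})\bigr) + I_i, \qquad i = 1,\dots,n,

where a_i > 0 are self-decay rates, w_{ij} the learned synaptic weights, \tau_{ij} \ge 0 the transmission delays, g_j sigmoidal activation functions and I_i constant external inputs; this is a sketch of the usual form, with symbols chosen here for illustration, and the exact notation of Gopalsamy & He (1994) may differ.

A minimal numerical sketch, assuming a single uniform delay tau and parameters invented for the example (it is not code from the chapter), shows how a trajectory of such a network settles on an equilibrium:

import numpy as np

n = 3                                     # number of neurons
a = np.array([1.0, 1.2, 0.8])             # self-decay rates a_i > 0 (illustrative)
W = np.array([[ 0.0,  0.5, -0.3],         # synaptic weights w_ij (illustrative)
              [ 0.4,  0.0,  0.2],
              [-0.2,  0.3,  0.0]])
I = np.array([0.1, -0.2, 0.05])           # constant external inputs I_i
g = np.tanh                               # sigmoidal activation
tau = 0.5                                 # uniform delay (per-pair delays tau_ij would need per-pair histories)

h = 1e-3                                  # forward-Euler step size
d = int(round(tau / h))                   # delay measured in steps
steps = int(30.0 / h)                     # integration horizon T = 30

x0 = np.array([0.5, -0.5, 0.2])           # constant initial history on [-tau, 0]
hist = np.tile(x0, (d + 1, 1))            # buffer holding the last d+1 states
x = x0.copy()

for _ in range(steps):
    x_delayed = hist[0]                   # approximates x(t - tau)
    x = x + h * (-a * x + W @ g(x_delayed) + I)
    hist = np.roll(hist, -1, axis=0)      # drop the oldest state,
    hist[-1] = x                          # append the newest one

print("state at T = 30:", x)
print("equilibrium residual:", -a * x + W @ g(x) + I)   # close to 0 if x is an equilibrium

With the small weights and decay rates above, the simulated trajectory approaches a single equilibrium despite the delay; the qualitative theory discussed in the abstract is precisely about guaranteeing such behavior, for all admissible initial histories, when the network has several equilibria.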