DOI: 10.1016/j.neunet.2024.106676
Journal: Neural Networks (JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; Impact Factor 6.0)
Published: 2024-08-30 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0893608024006002
Citations: 0
Relaxed stability criteria of delayed neural networks using delay-parameters-dependent slack matrices
This note aims to reduce the conservatism of stability criteria for neural networks with time-varying delay. To this end, on the one hand, we construct an augmented Lyapunov–Krasovskii functional (LKF) incorporating delay-product terms that capture more information about the neural states. On the other hand, when dealing with the derivative of the LKF, we introduce several parameter-dependent slack matrices into an affine integral inequality, zero equations, and the S-procedure. As a result, more relaxed stability criteria are obtained by employing the Lyapunov–Krasovskii Theorem. Two numerical examples show that the proposed stability criteria are less conservative than some existing methods.
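The paper's LMI-based criteria are not reproduced in this abstract, but a much cruder, delay-independent sufficient condition illustrates the kind of guarantee at stake. For a standard delayed neural network of the form x'(t) = -x(t) + W0·tanh(x(t)) + W1·tanh(x(t - d(t))) (a common model class in this literature; the paper's exact system is not given here), the origin is globally asymptotically stable for any bounded delay whenever the spectral norms of the weight matrices satisfy ||W0|| + ||W1|| < 1, since tanh is 1-Lipschitz. Lyapunov–Krasovskii machinery of the kind developed in the paper exists precisely because norm bounds like this are far more conservative. The function name below is illustrative, not from the paper:

```python
import numpy as np

def norm_stability_check(W0, W1, a_min=1.0):
    """Crude delay-independent sufficient condition for
    x'(t) = -a x(t) + W0 tanh(x(t)) + W1 tanh(x(t - d(t))),
    with self-decay rates a_i >= a_min: globally asymptotically
    stable for every bounded delay if ||W0||_2 + ||W1||_2 < a_min
    (tanh is 1-Lipschitz). Deliberately conservative; LKF-based
    LMI criteria accept far larger weight matrices."""
    total = np.linalg.norm(W0, 2) + np.linalg.norm(W1, 2)
    return total < a_min

# Small weights: the norm test passes easily.
W0 = np.array([[0.2, -0.1], [0.05, 0.3]])
W1 = np.array([[0.1, 0.0], [0.0, 0.1]])
print(norm_stability_check(W0, W1))  # True: norms sum to about 0.42
```

When this test fails, the network may still be stable; sharper delay-dependent LMI criteria such as those in the paper are designed to certify exactly such cases.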
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.