Mean-Square Synchronization of Additive Time-Varying Delayed Markovian Jumping Neural Networks Under Multiple Stochastic Sampling
Pratap Anbalagan, Zhiguang Feng, Tingwen Huang, Yukang Cui
IEEE Transactions on Neural Networks and Learning Systems, published 18 March 2025. DOI: 10.1109/TNNLS.2024.3478395 (https://doi.org/10.1109/TNNLS.2024.3478395)
Abstract
This study addresses the mean-square asymptotic synchronization problem of additive time-varying delayed Markovian jumping neural networks (ATVMJNNs) under a multiple stochastic sampling framework, together with its direct application to secure image encryption (SIE). To do this, we first assume the existence of multiple sampled-data periods that satisfy a Bernoulli distribution and introduce random variables to represent the positions of the input delays and sampling periods. Then, based on these assumptions, we develop a mode-dependent discontinuous Lyapunov-Krasovskii functional (DLKF) to reduce model conservatism. Next, we introduce a new auxiliary slack-matrix-based integral inequality (ASMBII) to approximate the integral quadratic terms arising from the derivative of the DLKF. Furthermore, we develop a multiple stochastic sampling framework to achieve asymptotic synchronization between the primary and secondary systems, and less conservative criteria for asymptotic stability of the error model in the mean-square sense are derived by solving a set of linear matrix inequalities (LMIs). Finally, we present numerical validations and corresponding experimental results from a practical image-processing application to demonstrate the benefits of the proposed algorithms and techniques. Both the numerical and practical results show that the proposed algorithms and techniques yield superior performance compared with existing studies.
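To make the multiple stochastic sampling idea concrete, the following is a minimal simulation sketch rather than the paper's actual model or synchronization criteria: a hypothetical two-neuron delayed drive network is paired with a response network steered by a piecewise-constant sampled-data controller, and at every sampling instant the next sampling period is drawn from two candidate values according to a Bernoulli-type probability. The system matrices, feedback gain K, delay, candidate periods, and probability p_short are all illustrative placeholders, and the constant delay and single (non-Markovian) mode are deliberate simplifications of the ATVMJNN setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-neuron delayed network: dx/dt = -C x + A f(x) + B f(x(t - tau))
C = np.diag([1.0, 1.0])
A = np.array([[2.0, -0.1], [-5.0, 3.0]])
B = np.array([[-1.5, -0.1], [-0.2, -2.5]])
tau = 0.1                     # constant delay used only for this sketch
f = np.tanh                   # activation function

K = np.diag([-15.0, -15.0])   # hypothetical sampled-data feedback gain
periods = [0.02, 0.05]        # two candidate sampling periods
p_short = 0.7                 # Bernoulli probability of drawing the short period

dt = 1e-3                     # Euler integration step
T = 5.0                       # simulation horizon
steps = int(T / dt)
d = int(tau / dt)             # delay expressed in integration steps

x = rng.normal(size=(steps + d, 2))   # drive (primary) system state history
y = rng.normal(size=(steps + d, 2))   # response (secondary) system state history
u = np.zeros(2)                       # piecewise-constant control input
next_sample = 0.0                     # time of the next sampling instant

for k in range(d, steps + d - 1):
    t = (k - d) * dt
    if t >= next_sample:
        # Sampling instant: refresh the held control and draw the next period.
        u = K @ (y[k] - x[k])
        h = periods[0] if rng.random() < p_short else periods[1]
        next_sample = t + h
    dx = -C @ x[k] + A @ f(x[k]) + B @ f(x[k - d])
    dy = -C @ y[k] + A @ f(y[k]) + B @ f(y[k - d]) + u
    x[k + 1] = x[k] + dt * dx
    y[k + 1] = y[k] + dt * dy

err = np.linalg.norm(y - x, axis=1)    # synchronization error norm over time
print("initial error:", err[d], " final error:", err[-1])
```

In the paper itself, mean-square convergence of the synchronization error is certified analytically via the DLKF and the resulting LMIs; a forward simulation like this only visualizes the stochastic sampling pattern and the error decay for one particular parameter choice.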
About the Journal
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.