G Soundararajan, R Suvetha, Minvydas Ragulskis, P Prakash
Title: Output sampling synchronization and state estimation in flux-charge domain memristive neural networks with leakage and time-varying delays
DOI: 10.1016/j.neunet.2024.107018
Journal: Neural Networks, vol. 184, p. 107018 (IF 6.0, JCR Q1 in Computer Science, Artificial Intelligence)
Published: 2024-12-10
Citations: 0
Abstract
This paper theoretically explores the coexistence of synchronization and state estimation analysis through output sampling measures for a class of memristive neural networks operating within the flux-charge domain. These networks are subject to constant delayed responses in self-feedback loops and time-varying delayed responses incorporated into the activation functions. A contemporary output sampling controller is designed to discretize the system dynamics based on the available output measurements; it enhances control performance by minimizing the update frequency, thereby overcoming network bandwidth limitations while addressing both network synchronization and state vector estimation. By utilizing differential inclusion mapping to capture the weights arising from discontinuous memristive switching actions, and an input-delay approach to bound the nonuniform sampling intervals, we present linear matrix inequality-based sufficient conditions for the synchronization and state estimation criteria under the Lyapunov-Krasovskii functional framework and a relaxed integral inequality. Finally, using a preset experimental dataset, we visually verify the adaptability of the proposed theoretical findings concerning synchronization, anti-synchronization, and state vector estimation of delayed memristive neural networks operating in the flux-charge domain. Furthermore, numerical validation through simulation demonstrates the impact of leakage delay and output measurement sampling via comparative analysis with scenarios lacking leakage and sampling measurements.
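To make the class of systems in the abstract concrete, the sketch below simulates a hypothetical one-neuron drive/response pair, not the paper's actual model: a state-dependent (memristive) switching weight, a constant leakage delay in the self-feedback term, a constant activation delay standing in for the time-varying one, and a zero-order-hold controller that updates only at the output sampling instants. All gains, delays, and thresholds are arbitrary illustrative choices.

```python
import math

def memristive_weight(v, w_on=1.2, w_off=0.6, thresh=1.0):
    # Illustrative memristive switching: the connection weight jumps
    # between two values depending on the magnitude of the neuron state.
    return w_on if abs(v) <= thresh else w_off

def simulate(T=20.0, dt=0.001, tau=0.2, leak=0.1, K=5.0, h=0.05):
    """Euler simulation of a one-neuron drive/response pair with a
    leakage delay `leak` in the self-feedback term, an activation
    delay `tau`, and an output error sampled every `h` seconds and
    held constant between samples (zero-order hold)."""
    n = int(T / dt)
    d_tau, d_leak = int(tau / dt), int(leak / dt)
    hist = max(d_tau, d_leak) + 1        # history buffer length
    x = [0.8] * hist                     # drive history + trajectory
    z = [-0.5] * hist                    # response history + trajectory
    u = 0.0
    steps_per_sample = int(h / dt)
    a = 1.0                              # leakage (self-feedback) gain
    for k in range(n):
        i = len(x) - 1                   # index of the current state
        if k % steps_per_sample == 0:    # controller updates only here
            u = -K * (z[i] - x[i])       # sampled output error feedback
        dx = -a * x[i - d_leak] + memristive_weight(x[i]) * math.tanh(x[i - d_tau])
        dz = -a * z[i - d_leak] + memristive_weight(z[i]) * math.tanh(z[i - d_tau]) + u
        x.append(x[i] + dt * dx)
        z.append(z[i] + dt * dz)
    return x, z
```

With these (arbitrary) parameters the sampled controller drives the synchronization error |z - x| toward zero; setting K = 0 or enlarging the sampling period h beyond its stability margin lets the trajectories separate again, loosely mirroring the comparative scenarios without sampling measurements mentioned in the abstract.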
Journal description:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically inspired artificial intelligence.