Multiscroll hidden attractor in memristive autapse neuron model and its memristor-based scroll control and application in image encryption

IF 6.0 | CAS Zone 1 (Computer Science) | JCR Q1 (Computer Science, Artificial Intelligence)
Zhiqiang Wan, Yi-Fei Pu, Qiang Lai
{"title":"Multiscroll hidden attractor in memristive autapse neuron model and its memristor-based scroll control and application in image encryption","authors":"Zhiqiang Wan ,&nbsp;Yi-Fei Pu ,&nbsp;Qiang Lai","doi":"10.1016/j.neunet.2025.107473","DOIUrl":null,"url":null,"abstract":"<div><div>In current neurodynamic studies, memristor models using polynomial or multiple nested composite functions are primarily employed to generate multiscroll attractors, but their complex mathematical form restricts both research and application. To address this issue, without relying on polynomial and multiple nested composite functions, this study devises a unique memristor model and a memristive autapse HR (MAHR) neuron model featuring multiscroll hidden attractor. Specially, the quantity of scrolls within the multiscroll hidden attractors is regulated by simulation time. Besides, a simple control factor is incorporated into the memristor to improve the MAHR neuron model. Numerical analysis further finds that the quantity of scrolls within the multiscroll hidden attractor from the improved MAHR neuron model can be conveniently adjusted by only changing a single parameter or initial condition of the memristor. Moreover, a microcontroller-based hardware experiment is conducted to confirm that the improved MAHR neuron model is physically feasible. Finally, an elegant image encryption scheme is proposed to explore the real-world applicability of the improved MAHR neuron model.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"188 ","pages":"Article 107473"},"PeriodicalIF":6.0000,"publicationDate":"2025-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025003521","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In current neurodynamic studies, memristor models built from polynomial or multiply nested composite functions are primarily employed to generate multiscroll attractors, but their complex mathematical form restricts both research and application. To address this issue, without relying on polynomial or nested composite functions, this study devises a unique memristor model and a memristive autapse HR (MAHR) neuron model featuring a multiscroll hidden attractor. Notably, the quantity of scrolls within the multiscroll hidden attractor is regulated by the simulation time. In addition, a simple control factor is incorporated into the memristor to improve the MAHR neuron model. Numerical analysis further shows that the quantity of scrolls within the multiscroll hidden attractor of the improved MAHR neuron model can be conveniently adjusted by changing only a single parameter or initial condition of the memristor. Moreover, a microcontroller-based hardware experiment confirms that the improved MAHR neuron model is physically feasible. Finally, an elegant image encryption scheme is proposed to explore the real-world applicability of the improved MAHR neuron model.
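The abstract does not reproduce the governing equations, so the sketch below is only an orientation aid: it simulates a standard Hindmarsh-Rose neuron (assuming that is what "HR" abbreviates here) augmented with a generic memristive autapse feedback current k*W(phi)*x. The memductance W, the coupling gain k, and every parameter value are illustrative assumptions, not the paper's MAHR model, and the sketch is not expected to reproduce the multiscroll hidden attractor described above.

```python
import numpy as np

# Minimal sketch: Hindmarsh-Rose neuron with a generic memristive autapse.
# W(phi), the coupling gain k, and all parameter values are assumptions for
# illustration; this is NOT the paper's MAHR model.

def W(phi, a=0.1, b=0.5):
    # Hypothetical smooth memductance (placeholder form, avoiding polynomials).
    return a + b * np.tanh(phi)

def mahr_step(state, dt, k=1.0):
    x, y, z, phi = state                  # membrane potential, recovery, adaptation, flux
    dx = y - x**3 + 3.0 * x**2 - z + k * W(phi) * x   # autapse feedback current k*W(phi)*x
    dy = 1.0 - 5.0 * x**2 - y
    dz = 0.006 * (4.0 * (x + 1.6) - z)
    dphi = x                              # flux driven by the membrane potential
    return state + dt * np.array([dx, dy, dz, dphi])

def simulate(T=1000.0, dt=0.01, init=(0.1, 0.0, 0.0, 0.0)):
    n = int(T / dt)
    traj = np.empty((n, 4))
    s = np.asarray(init, dtype=float)
    for i in range(n):
        s = mahr_step(s, dt)
        traj[i] = s
    return traj

if __name__ == "__main__":
    traj = simulate()
    print(traj[-1])   # inspect, e.g., the (x, phi) projection for scroll structure
```

For the image encryption application mentioned in the final sentence, chaos-based ciphers typically quantize a trajectory into a pseudo-random byte stream and use it for permutation and diffusion. The helper below is a generic sketch of that idea under those assumptions and makes no claim about the scheme actually proposed in the paper.

```python
def encrypt_image(img, traj):
    # Generic chaos-driven permutation-diffusion step (illustrative only).
    # img: uint8 grayscale array; traj must supply at least img.size samples.
    flat = img.flatten()
    x = traj[:flat.size, 0]                                   # one chaotic state variable
    key = (np.floor(np.abs(x) * 1e6) % 256).astype(np.uint8)  # quantize to a byte stream
    perm = np.argsort(traj[:flat.size, 3])                    # permutation from the flux variable
    return (flat[perm] ^ key).reshape(img.shape)
```

Decryption would invert the two steps: XOR with the same byte stream, then apply the inverse permutation.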
Source journal: Neural Networks (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles published: 425
Review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.