A Feed-Forward Neural Network for Increasing the Hopfield-Network Storage Capacity

Impact Factor: 6.6 · Zone 2 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Shaokai Zhao, Bin Chen, Hui Wang, Zhiyuan Luo, Zhang Tao
{"title":"A Feed-Forward Neural Network for Increasing the Hopfield-Network Storage Capacity","authors":"Shaokai Zhao, Bin Chen, Hui Wang, Zhiyuan Luo, Zhang Tao","doi":"10.1142/S0129065722500277","DOIUrl":null,"url":null,"abstract":"In the hippocampal dentate gyrus (DG), pattern separation mainly depends on the concepts of 'expansion recoding', meaning random mixing of different DG input channels. However, recent advances in neurophysiology have challenged the theory of pattern separation based on these concepts. In this study, we propose a novel feed-forward neural network, inspired by the structure of the DG and neural oscillatory analysis, to increase the Hopfield-network storage capacity. Unlike the previously published feed-forward neural networks, our bio-inspired neural network is designed to take advantage of both biological structure and functions of the DG. To better understand the computational principles of pattern separation in the DG, we have established a mouse model of environmental enrichment. We obtained a possible computational model of the DG, associated with better pattern separation ability, by using neural oscillatory analysis. Furthermore, we have developed a new algorithm based on Hebbian learning and coupling direction of neural oscillation to train the proposed neural network. The simulation results show that our proposed network significantly expands the storage capacity of Hopfield network, and more effective pattern separation is achieved. The storage capacity rises from 0.13 for the standard Hopfield network to 0.32 using our model when the overlap in patterns is 10%.","PeriodicalId":50305,"journal":{"name":"International Journal of Neural Systems","volume":"1 1","pages":"2250027"},"PeriodicalIF":6.6000,"publicationDate":"2022-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Neural Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1142/S0129065722500277","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 1

Abstract

In the hippocampal dentate gyrus (DG), pattern separation mainly depends on the concept of 'expansion recoding', i.e. the random mixing of different DG input channels. However, recent advances in neurophysiology have challenged the theory of pattern separation based on this concept. In this study, we propose a novel feed-forward neural network, inspired by the structure of the DG and by neural oscillatory analysis, to increase the Hopfield-network storage capacity. Unlike previously published feed-forward neural networks, our bio-inspired neural network is designed to take advantage of both the biological structure and the functions of the DG. To better understand the computational principles of pattern separation in the DG, we established a mouse model of environmental enrichment. Using neural oscillatory analysis, we obtained a possible computational model of the DG associated with better pattern separation ability. Furthermore, we developed a new algorithm based on Hebbian learning and the coupling direction of neural oscillation to train the proposed neural network. The simulation results show that our proposed network significantly expands the storage capacity of the Hopfield network and achieves more effective pattern separation. The storage capacity rises from 0.13 for the standard Hopfield network to 0.32 with our model when the overlap between patterns is 10%.
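For context on the 0.13 baseline quoted above: a standard Hopfield network trained with the Hebbian (outer-product) rule can reliably store roughly 0.138N random patterns before retrieval collapses. The sketch below is a minimal illustration of that baseline only; it is not the authors' DG-inspired feed-forward model, and the network size, pattern loads, and noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Standard Hebbian (outer-product) rule; patterns has shape (P, N), entries in {-1, +1}."""
    _, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, state, steps=50):
    """Repeated synchronous sign updates until a fixed point (or step limit) is reached."""
    for _ in range(steps):
        nxt = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

def overlap(a, b):
    """Normalized overlap m = (1/N) * sum_i a_i * b_i; m = 1 means perfect recall."""
    return float(a @ b) / len(a)

N = 500
for load in (0.10, 0.14, 0.20):       # pattern load alpha = P / N
    P = int(load * N)
    patterns = rng.choice([-1, 1], size=(P, N))
    W = hebbian_weights(patterns)
    # cue: a stored pattern with 10% of its bits flipped
    cue = patterns[0].copy()
    flip = rng.choice(N, size=N // 10, replace=False)
    cue[flip] *= -1
    m = overlap(recall(W, cue), patterns[0])
    print(f"alpha = {load:.2f}  recall overlap = {m:.3f}")
```

Below the classical load of about 0.138 the noisy cue is typically cleaned up almost perfectly, while near and above it retrieval tends to collapse; this is the baseline that the proposed DG-inspired network is reported to push to roughly 0.32 at 10% pattern overlap.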
Source Journal

International Journal of Neural Systems (Engineering & Technology – Computer Science: Artificial Intelligence)

CiteScore: 11.30
Self-citation rate: 28.80%
Articles per year: 116
Review time: 24 months

Journal description: The International Journal of Neural Systems is a monthly, rigorously peer-reviewed transdisciplinary journal focusing on information processing in both natural and artificial neural systems. Special interests include machine learning, computational neuroscience, and neurology. The journal prioritizes innovative, high-impact articles spanning multiple fields, including the neurosciences, computer science, and engineering. It adopts an open-minded approach to this multidisciplinary field, serving as a platform for novel ideas and an enhanced understanding of collective and cooperative phenomena in computationally capable systems.