IF 10.2 · JCR Q1 · CAS Zone 1, Computer Science · COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Malu Zhang, Xiaoling Luo, Jibin Wu, Ammar Belatreche, Siqi Cai, Yang Yang, Haizhou Li
DOI: 10.1109/TNNLS.2025.3543673 (https://doi.org/10.1109/TNNLS.2025.3543673)
Journal: IEEE Transactions on Neural Networks and Learning Systems
Published: 2025-03-14 (Journal Article)
Citations: 0

Abstract

Toward Building Human-Like Sequential Memory Using Brain-Inspired Spiking Neural Models.

The brain is able to acquire and store memories of everyday experiences in real time. It can also selectively forget information to facilitate memory updating. However, our understanding of the underlying mechanisms and coordination of these processes within the brain remains limited. Moreover, no existing artificial intelligence model has yet matched human-level capabilities in memory storage and retrieval. This study introduces a brain-inspired spiking neural model that integrates the learning and forgetting processes of sequential memory. The proposed model closely mimics the distributed and sparse temporal coding observed in biological neural systems. It employs one-shot online learning for memory formation and uses the biologically plausible mechanisms of neural oscillation and phase precession to retrieve memorized sequences reliably. In addition, an active forgetting mechanism is integrated into the spiking neural model, enabling memory removal, flexibility, and updating. The proposed memory model not only enhances our understanding of human memory processes but also provides a robust framework for addressing temporal modeling tasks.
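The abstract names three ingredients: one-shot storage of a sequence, cue-driven replay, and active removal of a stored memory. The sketch below is a deliberately simplified, hypothetical illustration of that store/recall/forget loop using plain NumPy hetero-associative (Hebbian outer-product) weights between successive sparse patterns; it does not implement the paper's spiking dynamics, neural oscillations, or phase precession, and all names (`SequenceMemory`, `sparse_pattern`, parameters `N`, `K`) are invented for this example.

```python
import numpy as np

# Toy one-shot sequential memory with an explicit "active forgetting"
# step. Illustrative only -- NOT the paper's spiking model: spiking
# dynamics and phase precession are replaced by a simple Hebbian
# hetero-association between successive sparse binary patterns.

rng = np.random.default_rng(0)
N, K = 200, 10          # units per pattern, active units per pattern

def sparse_pattern():
    """Random sparse binary pattern with K of N units active."""
    p = np.zeros(N)
    p[rng.choice(N, size=K, replace=False)] = 1.0
    return p

class SequenceMemory:
    def __init__(self):
        self.W = np.zeros((N, N))   # hetero-associative weights

    def store(self, seq):
        """One-shot storage: bind each item to its successor."""
        for a, b in zip(seq, seq[1:]):
            self.W += np.outer(b, a)

    def recall(self, cue, steps):
        """Replay the sequence from a cue via top-K thresholding."""
        out, x = [cue], cue
        for _ in range(steps):
            drive = self.W @ x
            # keep the K most strongly driven units active
            x = (drive >= np.partition(drive, -K)[-K]).astype(float)
            out.append(x)
        return out

    def forget(self, seq):
        """Active forgetting: subtract the sequence's contribution."""
        for a, b in zip(seq, seq[1:]):
            self.W -= np.outer(b, a)

seq = [sparse_pattern() for _ in range(5)]
mem = SequenceMemory()
mem.store(seq)
recalled = mem.recall(seq[0], steps=4)
print(all(np.array_equal(r, s) for r, s in zip(recalled, seq)))
mem.forget(seq)
print(np.allclose(mem.W, 0))   # weights return exactly to zero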

Source journal
IEEE transactions on neural networks and learning systems
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
CiteScore: 23.80
Self-citation rate: 9.60%
Articles per year: 2102
Review time: 3-8 weeks
Journal description: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.