Memristive leaky integrate-and-fire neuron and learnable straight-through estimator in spiking neural networks

IF 3.1 | CAS Tier 3 (Engineering & Technology) | JCR Q2 (Neurosciences)
Tao Chen, Chunyan She, Lidan Wang, Shukai Duan
{"title":"尖峰神经网络中的 Memristive 渗漏整合发射神经元和可学习直通估计器","authors":"Tao Chen, Chunyan She, Lidan Wang, Shukai Duan","doi":"10.1007/s11571-024-10133-w","DOIUrl":null,"url":null,"abstract":"<p>Compared to artificial neural networks (ANNs), spiking neural networks (SNNs) present a more biologically plausible model of neural system dynamics. They rely on sparse binary spikes to communicate information and operate in an asynchronous, event-driven manner. Despite the high heterogeneity of the neural system at the neuronal level, most current SNNs employ the widely used leaky integrate-and-fire (LIF) neuron model, which assumes uniform membrane-related parameters throughout the entire network. This approach hampers the expressiveness of spiking neurons and restricts the diversity of neural dynamics. In this paper, we propose replacing the resistor in the LIF model with a discrete memristor to obtain the heterogeneous memristive LIF (MLIF) model. The memristance of the discrete memristor is determined by the voltage and flux at its terminals, leading to dynamic changes in the membrane time parameter of the MLIF model. SNNs composed of MLIF neurons can not only learn synaptic weights but also adaptively change membrane time parameters according to the membrane potential of the neuron, enhancing the learning ability and expression of SNNs. Furthermore, since the proper threshold of spiking neurons can improve the information capacity of SNNs, a learnable straight-through estimator (LSTE) is proposed. The LSTE, based on the straight-through estimator (STE) surrogate function, features a learnable threshold that facilitates the backward propagation of gradients through neurons firing spikes. Extensive experiments on several popular static and neuromorphic benchmark datasets demonstrate the effectiveness of the proposed MLIF and LSTE, especially on the DVS-CIFAR10 dataset, where we achieved the top-1 accuracy of 84.40<span>\\(\\%\\)</span>.</p>","PeriodicalId":10500,"journal":{"name":"Cognitive Neurodynamics","volume":null,"pages":null},"PeriodicalIF":3.1000,"publicationDate":"2024-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Memristive leaky integrate-and-fire neuron and learnable straight-through estimator in spiking neural networks\",\"authors\":\"Tao Chen, Chunyan She, Lidan Wang, Shukai Duan\",\"doi\":\"10.1007/s11571-024-10133-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Compared to artificial neural networks (ANNs), spiking neural networks (SNNs) present a more biologically plausible model of neural system dynamics. They rely on sparse binary spikes to communicate information and operate in an asynchronous, event-driven manner. Despite the high heterogeneity of the neural system at the neuronal level, most current SNNs employ the widely used leaky integrate-and-fire (LIF) neuron model, which assumes uniform membrane-related parameters throughout the entire network. This approach hampers the expressiveness of spiking neurons and restricts the diversity of neural dynamics. In this paper, we propose replacing the resistor in the LIF model with a discrete memristor to obtain the heterogeneous memristive LIF (MLIF) model. The memristance of the discrete memristor is determined by the voltage and flux at its terminals, leading to dynamic changes in the membrane time parameter of the MLIF model. 
SNNs composed of MLIF neurons can not only learn synaptic weights but also adaptively change membrane time parameters according to the membrane potential of the neuron, enhancing the learning ability and expression of SNNs. Furthermore, since the proper threshold of spiking neurons can improve the information capacity of SNNs, a learnable straight-through estimator (LSTE) is proposed. The LSTE, based on the straight-through estimator (STE) surrogate function, features a learnable threshold that facilitates the backward propagation of gradients through neurons firing spikes. Extensive experiments on several popular static and neuromorphic benchmark datasets demonstrate the effectiveness of the proposed MLIF and LSTE, especially on the DVS-CIFAR10 dataset, where we achieved the top-1 accuracy of 84.40<span>\\\\(\\\\%\\\\)</span>.</p>\",\"PeriodicalId\":10500,\"journal\":{\"name\":\"Cognitive Neurodynamics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2024-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cognitive Neurodynamics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1007/s11571-024-10133-w\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Neurodynamics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s11571-024-10133-w","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
引用次数: 0

Abstract

Compared to artificial neural networks (ANNs), spiking neural networks (SNNs) present a more biologically plausible model of neural system dynamics. They rely on sparse binary spikes to communicate information and operate in an asynchronous, event-driven manner. Despite the high heterogeneity of the neural system at the neuronal level, most current SNNs employ the widely used leaky integrate-and-fire (LIF) neuron model, which assumes uniform membrane-related parameters throughout the entire network. This approach hampers the expressiveness of spiking neurons and restricts the diversity of neural dynamics. In this paper, we propose replacing the resistor in the LIF model with a discrete memristor to obtain the heterogeneous memristive LIF (MLIF) model. The memristance of the discrete memristor is determined by the voltage and flux at its terminals, leading to dynamic changes in the membrane time parameter of the MLIF model. SNNs composed of MLIF neurons can not only learn synaptic weights but also adaptively change membrane time parameters according to the membrane potential of the neuron, enhancing the learning ability and expressiveness of SNNs. Furthermore, since a proper threshold for spiking neurons can improve the information capacity of SNNs, a learnable straight-through estimator (LSTE) is proposed. The LSTE, based on the straight-through estimator (STE) surrogate function, features a learnable threshold that facilitates the backward propagation of gradients through spiking neurons. Extensive experiments on several popular static and neuromorphic benchmark datasets demonstrate the effectiveness of the proposed MLIF and LSTE; in particular, on the DVS-CIFAR10 dataset we achieve a top-1 accuracy of 84.40%.
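The abstract does not give the MLIF update equations, so the following is a minimal PyTorch sketch of the general idea only: a discrete-time LIF neuron whose membrane time parameter is recomputed each step from the memristance of a flux-controlled memristor, where the flux is accumulated from the membrane potential. The quadratic memristance law M(phi) = a + 3*b*phi^2, the class name MLIFNeuron, the parameters a and b, and the hard reset are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class MLIFNeuron(nn.Module):
    """Sketch of a memristive LIF neuron (assumptions noted in comments)."""

    def __init__(self, a=1.0, b=0.1, v_threshold=1.0, dt=1.0):
        super().__init__()
        self.a, self.b = a, b            # memristor parameters (assumed values)
        self.v_threshold = v_threshold   # firing threshold
        self.dt = dt                     # simulation time step

    def forward(self, current_seq):
        # current_seq: (T, batch, features) input current over T time steps
        T = current_seq.shape[0]
        v = torch.zeros_like(current_seq[0])     # membrane potential
        flux = torch.zeros_like(current_seq[0])  # time integral of membrane potential
        spikes = []
        for t in range(T):
            flux = flux + v * self.dt                        # accumulate flux
            memristance = self.a + 3.0 * self.b * flux ** 2  # assumed quadratic M(phi)
            beta = torch.exp(-self.dt / memristance)         # state-dependent leak factor
            v = beta * v + current_seq[t]                    # leaky integration
            spike = (v >= self.v_threshold).float()          # fire when threshold is reached
            v = v * (1.0 - spike)                            # hard reset (assumption)
            spikes.append(spike)
        return torch.stack(spikes)
```

Because each neuron accumulates a different flux, its effective membrane time parameter drifts away from its neighbours', which is the heterogeneity the MLIF model aims to provide.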
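Likewise, a minimal sketch of a learnable straight-through estimator: the forward pass emits hard binary spikes at a trainable threshold, while the backward pass uses a rectangular STE window so gradients reach both the membrane potential and the threshold. The detach-based implementation, the window width, and the class name LSTESpike are assumptions for illustration; the paper's exact surrogate may differ.

```python
import torch
import torch.nn as nn


class LSTESpike(nn.Module):
    """Sketch of a straight-through estimator with a learnable threshold."""

    def __init__(self, init_threshold=1.0, window=1.0):
        super().__init__()
        self.threshold = nn.Parameter(torch.tensor(init_threshold))  # trainable threshold
        self.window = window  # half-width of the surrogate window (assumed)

    def forward(self, v):
        # Hard spike used in the forward pass.
        spike_hard = (v >= self.threshold).float()
        # Piecewise-linear surrogate: gradient 1/(2*window) inside the window
        # around the threshold, zero outside (rectangular STE).
        surrogate = torch.clamp(
            0.5 + (v - self.threshold) / (2.0 * self.window), 0.0, 1.0
        )
        # Detach trick: forward value equals the hard spike, while gradients
        # follow the surrogate with respect to both v and the threshold.
        return surrogate + (spike_hard - surrogate).detach()
```

In training, the hard threshold comparison inside MLIFNeuron would be replaced by a call to LSTESpike so that spike generation stays differentiable end to end.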

Source journal: Cognitive Neurodynamics (Medicine - Neuroscience)
CiteScore: 6.90
Self-citation rate: 18.90%
Annual publications: 140
Review time: 12 months
Journal description: Cognitive Neurodynamics provides a unique forum of communication and cooperation for scientists and engineers working in the field of cognitive neurodynamics, intelligent science and applications, bridging the gap between theory and application, without any preference for purely theoretical, experimental or computational models. The emphasis is on publishing original models of cognitive neurodynamics, novel computational theories and experimental results. In particular, intelligent science inspired by cognitive neuroscience and neurodynamics is also very welcome. The scope of Cognitive Neurodynamics covers cognitive neuroscience, neural computation based on dynamics, computer science, intelligent science, as well as their interdisciplinary applications in the natural and engineering sciences. Papers that are appropriate for non-specialist readers are encouraged.
1. There is no page limit for manuscripts submitted to Cognitive Neurodynamics. Research papers should clearly represent an important advance of especially broad interest to researchers and technologists in neuroscience, biophysics, BCI, neural computation and intelligent robotics.
2. Cognitive Neurodynamics also welcomes brief communications: short papers reporting results that are of genuinely broad interest but that for one reason or another do not make a sufficiently complete story to justify a full article. Brief communications should consist of approximately four manuscript pages.
3. Cognitive Neurodynamics publishes review articles in which a specific field is reviewed through an exhaustive literature survey. There are no restrictions on the number of pages. Review articles are usually invited, but submitted reviews will also be considered.