A temporally and spatially local spike-based backpropagation algorithm to enable training in hardware

Anmol Biswas, V. Saraswat, U. Ganguly
{"title":"A temporally and spatially local spike-based backpropagation algorithm to enable training in hardware","authors":"Anmol Biswas, V. Saraswat, U. Ganguly","doi":"10.1088/2634-4386/acf1c5","DOIUrl":null,"url":null,"abstract":"Spiking neural networks (SNNs) have emerged as a hardware efficient architecture for classification tasks. The challenge of spike-based encoding has been the lack of a universal training mechanism performed entirely using spikes. There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs): (1) SNNs can be trained by externally computed numerical gradients. (2) A major advancement towards native spike-based learning has been the use of approximate BP using spike-time dependent plasticity with phased forward/backward passes. However, the transfer of information between such phases for gradient and weight update calculation necessitates external memory and computational access. This is a challenge for standard neuromorphic hardware implementations. In this paper, we propose a stochastic SNN based back-prop (SSNN-BP) algorithm that utilizes a composite neuron to simultaneously compute the forward pass activations and backward pass gradients explicitly with spikes. Although signed gradient values are a challenge for spike-based representation, we tackle this by splitting the gradient signal into positive and negative streams. The composite neuron encodes information in the form of stochastic spike-trains and converts BP weight updates into temporally and spatially local spike coincidence updates compatible with hardware-friendly resistive processing units. Furthermore, we characterize the quantization effect of discrete spike-based weight update to show that our method approaches BP ANN baseline with sufficiently long spike-trains. 
Finally, we show that the well-performing softmax cross-entropy loss function can be implemented through inhibitory lateral connections enforcing a winner take all rule. Our SNN with a two-layer network shows excellent generalization through comparable performance to ANNs with equivalent architecture and regularization parameters on static image datasets like MNIST, Fashion-MNIST, Extended MNIST, and temporally encoded image datasets like Neuromorphic MNIST datasets. Thus, SSNN-BP enables BP compatible with purely spike-based neuromorphic hardware.","PeriodicalId":198030,"journal":{"name":"Neuromorphic Computing and Engineering","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neuromorphic Computing and Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2634-4386/acf1c5","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Spiking neural networks (SNNs) have emerged as a hardware-efficient architecture for classification tasks. The challenge of spike-based encoding has been the lack of a universal training mechanism performed entirely using spikes. There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs): (1) SNNs can be trained by externally computed numerical gradients. (2) A major advance towards native spike-based learning has been the use of approximate BP based on spike-timing-dependent plasticity with phased forward/backward passes. However, transferring information between these phases for gradient and weight-update calculation requires external memory and computational access, which is a challenge for standard neuromorphic hardware implementations. In this paper, we propose a stochastic SNN-based backpropagation (SSNN-BP) algorithm that uses a composite neuron to compute both the forward-pass activations and the backward-pass gradients explicitly with spikes. Although signed gradient values are a challenge for spike-based representation, we tackle this by splitting the gradient signal into positive and negative streams. The composite neuron encodes information as stochastic spike-trains and converts BP weight updates into temporally and spatially local spike-coincidence updates compatible with hardware-friendly resistive processing units. Furthermore, we characterize the quantization effect of discrete spike-based weight updates and show that our method approaches the BP-trained ANN baseline given sufficiently long spike-trains. Finally, we show that the well-performing softmax cross-entropy loss function can be implemented through inhibitory lateral connections enforcing a winner-take-all rule.
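The sign-splitting and coincidence-update ideas above can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the function names, the assumption that values lie in [0, 1] (and gradients in [-1, 1]), and the Bernoulli spike encoding are illustrative choices. The key point it demonstrates is that counting coincidences between a presynaptic activity stream and the positive/negative error streams, then taking their difference, recovers the activation-times-gradient product that BP needs, using only local binary events.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_stochastic(value, T):
    """Encode a value in [0, 1] as a Bernoulli spike-train of length T."""
    return (rng.random(T) < value).astype(np.int8)

def encode_signed(grad, T):
    """Split a signed gradient in [-1, 1] into a positive and a negative
    spike stream, so that sign information survives the binary encoding."""
    pos = encode_stochastic(max(grad, 0.0), T)
    neg = encode_stochastic(max(-grad, 0.0), T)
    return pos, neg

def coincidence_update(pre_spikes, pos_err, neg_err, lr, T):
    """Local weight update: count spike coincidences between the presynaptic
    stream and each error stream. The difference of the two counts
    (normalized by T) approximates lr * activation * gradient."""
    dw_pos = np.sum(pre_spikes & pos_err)
    dw_neg = np.sum(pre_spikes & neg_err)
    return lr * (dw_pos - dw_neg) / T

# Example: activation 0.8, gradient -0.5 -> expected update ~ 0.8 * (-0.5) = -0.4
T = 20000
pre = encode_stochastic(0.8, T)
pos, neg = encode_signed(-0.5, T)
dw = coincidence_update(pre, pos, neg, lr=1.0, T=T)
```

Because every quantity involved is a binary coincidence count, the update is computable locally in time and space, which is what makes it compatible with crossbar-style resistive processing units.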
Our two-layer SNN generalizes well, performing comparably to ANNs with equivalent architecture and regularization parameters on static image datasets such as MNIST, Fashion-MNIST, and Extended MNIST, and on temporally encoded image datasets such as Neuromorphic-MNIST. Thus, SSNN-BP enables backpropagation that is compatible with purely spike-based neuromorphic hardware.
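The abstract's claim that the method approaches the ANN baseline "with sufficiently long spike-trains" follows from the statistics of stochastic encoding: the error of a rate estimated from T Bernoulli spikes shrinks as 1/sqrt(T). A small sketch (an illustration of that general statistical fact, not code from the paper) makes the trend concrete:

```python
import numpy as np

rng = np.random.default_rng(1)

def rate_estimate_error(true_rate, T, trials=200):
    """Mean absolute error of estimating a firing rate from T Bernoulli
    spikes, averaged over independent trials."""
    spikes = rng.random((trials, T)) < true_rate
    estimates = spikes.mean(axis=1)          # empirical rate per trial
    return np.abs(estimates - true_rate).mean()

# Quantization error of the spike-based representation vs. train length
errs = {T: rate_estimate_error(0.3, T) for T in (16, 256, 4096)}
```

Longer spike-trains therefore trade latency and energy for gradient precision, which is why the spike-based weight updates converge towards the exact BP updates as T grows.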