Spike-Aware Training and Timing Window Optimization for Energy-Efficient Inference in Conversion-Based Spiking Neural Networks

Vijaya Kumar, Suresh Balanethiram
DOI: 10.1109/IConSCEPT57958.2023.10170596
Published in: 2023 International Conference on Signal Processing, Computation, Electronics, Power and Telecommunication (IConSCEPT), 2023-05-25

Abstract

Spiking Neural Networks (SNNs) are a promising alternative to traditional Deep Neural Networks (DNNs) due to their ability to operate in a low-power, event-driven mode. However, training SNNs from scratch remains challenging, so conversion-based SNNs derived from pre-trained DNNs have become popular. In this paper, we focus on generating learnable parameters for the inference phase by analyzing the timing window of rate-coded spiking activation on N-MNIST digit classification. We compare the training accuracy of a non-spiking ANN model against spike-ignored and spike-aware spiking activation models trained over different time intervals. We also use regularization to control the mean spike rate of neurons and add a moving-average pooling layer to improve classification accuracy. We provide insights into optimizing the timing window of rate-coded spiking activation for energy-efficient and accurate SNN inference. Our results show that spike-aware training with regularization and moving-average pooling improves convergence and achieves high accuracy. These findings can help improve the training of SNNs for various AI applications.
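The three ingredients named above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it assumes Bernoulli rate coding, an L2 penalty on the mean spike rate, and simple stride-k average pooling, and all function names (`rate_coded_spikes`, `spike_aware_activation`, `spike_rate_penalty`, `moving_average_pool`) are hypothetical. The point it demonstrates is the timing-window trade-off: a longer window T gives a spike-rate estimate closer to the analog ANN activation, at the cost of more spikes (i.e., more energy).

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_coded_spikes(act, T):
    """Bernoulli rate coding: per-step firing probability equals the
    (clipped) analog activation; returns a (T, ...) boolean spike train."""
    p = np.clip(act, 0.0, 1.0)
    return rng.random((T,) + act.shape) < p

def spike_aware_activation(act, T):
    """Spike-aware estimate of the activation: empirical firing rate
    over a timing window of T steps (a spike-ignored model would just
    use `act` directly)."""
    return rate_coded_spikes(act, T).mean(axis=0)

def spike_rate_penalty(spikes, target_rate=0.1, lam=1e-3):
    """Regularizer keeping the mean spike rate of a layer near a target,
    which limits spike count and hence inference energy."""
    mean_rate = spikes.mean()
    return lam * (mean_rate - target_rate) ** 2

def moving_average_pool(x, k=2):
    """Non-overlapping moving-average pooling with window/stride k along
    the last axis; averaging smooths the noisy rate estimates."""
    trimmed = x[..., : (x.shape[-1] // k) * k]
    return trimmed.reshape(*trimmed.shape[:-1], -1, k).mean(axis=-1)

# Longer timing windows shrink the gap between the spike-rate estimate
# and the analog activation it encodes.
act = np.array([0.2, 0.5, 0.9])
for T in (10, 100, 1000):
    est = spike_aware_activation(act, T)
    print(f"T={T:4d}  max |rate - act| = {np.abs(est - act).max():.3f}")
```

In this toy setup the estimation error decays roughly as 1/sqrt(T), which is one way to read the paper's timing-window optimization: pick the shortest window whose rate estimates are accurate enough to preserve classification accuracy.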