Time-Reduced Model for Multilayer Spiking Neural Networks

Yanjing Li
International Journal of Systems Engineering, published 2023-02-16
DOI: 10.11648/j.ijse.20230701.11 (https://doi.org/10.11648/j.ijse.20230701.11)

Abstract

Spiking neural networks (SNNs) are a type of biological neural network model that is more biologically plausible and computationally powerful than traditional artificial neural networks (ANNs). SNNs can achieve the same goals as ANNs and can be built into large-scale network structures (i.e., deep spiking neural networks) to accomplish complex tasks. However, training a deep spiking neural network is difficult due to the non-differentiable nature of spike events, and it requires substantial computation time. In this paper, a time-reduced model adopting two methods is presented for reducing the computation time of a deep spiking neural network: approximating the spike response function by a piecewise linear method, and choosing a suitable number of sub-synapses. The experimental results show that both the piecewise linear approximation and the rule for choosing the number of sub-synapses are effective. The approach not only reduces the training time but also simplifies the network structure. With the piecewise linear approximation, the computation time of the original model is reduced by at least half. With the rule for choosing the number of sub-synapses, the computation time is reduced to less than one-tenth of that of the original model on the XOR and Iris tasks.