Synaptic Modulation using Interspike Intervals Increases Energy Efficiency of Spiking Neural Networks
Dylan Adams, Magda Zajaczkowska, Ashiq Anjum, Andrea Soltoggio, Shirin Dora
arXiv - CS - Neural and Evolutionary Computing, 2024-08-06, https://doi.org/arxiv-2408.02961
Abstract
Despite basic differences between Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs), most research on SNNs involves adapting ANN-based methods for SNNs. Pruning (dropping connections) and quantization (reducing precision) are often used to improve the energy efficiency of SNNs. These methods are very effective for ANNs, whose energy needs are determined by signals transmitted on synapses. However, the event-driven paradigm in SNNs implies that energy is consumed by spikes. In this paper, we propose a new synapse model whose weights are modulated by Interspike Intervals (ISIs), i.e., the time difference between two consecutive spikes. SNNs composed of this synapse model, termed ISI Modulated SNNs (IMSNNs), can use gradient descent to estimate how the ISI of a neuron changes after updating its synaptic parameters. A higher ISI implies fewer spikes, and vice versa. The learning algorithm for IMSNNs exploits this information to selectively propagate gradients such that learning is achieved by increasing ISIs, resulting in a network that generates fewer spikes. The performance of IMSNNs with dense and convolutional layers has been evaluated in terms of classification accuracy and the number of spikes using the MNIST and FashionMNIST datasets. A performance comparison with conventional SNNs shows that IMSNNs exhibit up to a 90% reduction in the number of spikes while maintaining similar classification accuracy.
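To make the two ideas in the abstract concrete, here is a minimal NumPy sketch. It is not the authors' implementation: the modulation kernel (1 - exp(-ISI/tau)), the function names isi_modulated_weight and selective_gradient_mask, and the constant tau are illustrative assumptions, and the ISI sensitivities dISI/dw are taken as given rather than produced by the paper's gradient-descent estimator.

```python
import numpy as np

def isi_modulated_weight(w, isi, tau=20.0):
    """Hypothetical ISI modulation: the effective weight grows with the
    presynaptic ISI, so rapidly firing inputs are attenuated. The kernel
    choice is illustrative; the paper's exact function may differ."""
    return w * (1.0 - np.exp(-isi / tau))

def selective_gradient_mask(grad_w, d_isi_d_w):
    """Selective gradient propagation (sketch). An SGD step changes w by
    -lr * grad_w, so the predicted ISI change is proportional to
    -grad_w * dISI/dw. Components predicted to shorten the ISI (i.e.,
    produce more spikes) are zeroed out; the rest pass through."""
    predicted_isi_change = -grad_w * d_isi_d_w
    return np.where(predicted_isi_change >= 0.0, grad_w, 0.0)

# Toy usage with a single vector of synapses.
rng = np.random.default_rng(0)
w = rng.normal(size=5)                 # synaptic weights
isi = rng.uniform(5.0, 50.0, size=5)   # presynaptic ISIs (ms)
grad_w = rng.normal(size=5)            # loss gradient w.r.t. weights
d_isi_d_w = rng.normal(size=5)         # assumed ISI sensitivities

w_eff = isi_modulated_weight(w, isi)   # weights seen by the forward pass
masked = selective_gradient_mask(grad_w, d_isi_d_w)
w -= 0.1 * masked                      # SGD step using only spike-reducing components
```

One design point worth noting: masking the gradient, rather than adding a spike penalty to the loss, keeps each update a standard SGD step while biasing learning toward longer ISIs and hence fewer spikes.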