{"title":"卷积神经网络到每个神经元少于一个尖峰的尖峰神经网络的转换","authors":"Javier Lopez-Randulfe, Nico Reeb, Alois Knoll","doi":"10.32470/ccn.2022.1081-0","DOIUrl":null,"url":null,"abstract":"Spiking neural networks can leverage the high efficiency of temporal coding by converting architectures that were previously learnt with the backpropagation algorithm. In this work, we present the application of a time-coded neuron model for the conversion of classic artificial neural networks that re-duces the computational complexity in the synaptic connections. By adapting the ReLU activation function, the network achieved a sparsity of 0.142 spikes per neuron. The classifi-cation of handwritten digits from the MNIST dataset show that the neuron model is able to convert convolutional neural networks with several hidden layers.","PeriodicalId":341186,"journal":{"name":"2022 Conference on Cognitive Computational Neuroscience","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Conversion of ConvNets to Spiking Neural Networks With Less Than One Spike per Neuron\",\"authors\":\"Javier Lopez-Randulfe, Nico Reeb, Alois Knoll\",\"doi\":\"10.32470/ccn.2022.1081-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Spiking neural networks can leverage the high efficiency of temporal coding by converting architectures that were previously learnt with the backpropagation algorithm. In this work, we present the application of a time-coded neuron model for the conversion of classic artificial neural networks that re-duces the computational complexity in the synaptic connections. By adapting the ReLU activation function, the network achieved a sparsity of 0.142 spikes per neuron. The classifi-cation of handwritten digits from the MNIST dataset show that the neuron model is able to convert convolutional neural networks with several hidden layers.\",\"PeriodicalId\":341186,\"journal\":{\"name\":\"2022 Conference on Cognitive Computational Neuroscience\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 Conference on Cognitive Computational Neuroscience\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.32470/ccn.2022.1081-0\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 Conference on Cognitive Computational Neuroscience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32470/ccn.2022.1081-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Conversion of ConvNets to Spiking Neural Networks With Less Than One Spike per Neuron
Spiking neural networks can leverage the high efficiency of temporal coding by converting architectures that were previously trained with the backpropagation algorithm. In this work, we present the application of a time-coded neuron model for the conversion of classic artificial neural networks that reduces the computational complexity of the synaptic connections. By adapting the ReLU activation function, the network achieved a sparsity of 0.142 spikes per neuron. The classification of handwritten digits from the MNIST dataset shows that the neuron model is able to convert convolutional neural networks with several hidden layers.
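The abstract does not spell out the conversion scheme, but the core idea behind fewer than one spike per neuron can be sketched with time-to-first-spike coding: each neuron fires at most once, and a zero ReLU activation produces no spike at all. The minimal Python sketch below is an illustration under that assumption; the encoding function and its linear normalisation are hypothetical, not the authors' implementation.

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def activations_to_spike_times(acts, t_max=1.0):
    # Time-to-first-spike coding (illustrative assumption):
    # a larger activation maps to an earlier spike time.
    # Neurons with zero activation never fire (time = inf), which is
    # how an average of fewer than one spike per neuron can arise.
    acts = np.asarray(acts, dtype=float)
    times = np.full(acts.shape, np.inf)
    active = acts > 0
    if active.any():
        # Map the largest activation to t = 0 and smaller ones later.
        times[active] = t_max * (1.0 - acts[active] / acts.max())
    return times

# Toy example: one ReLU layer followed by temporal encoding.
rng = np.random.default_rng(0)
a = relu(rng.normal(size=8))
t = activations_to_spike_times(a)
print("activations:      ", np.round(a, 3))
print("spike times:      ", np.round(t, 3))
print("spikes per neuron:", np.isfinite(t).mean())

Because the ReLU clips roughly half of the toy activations to zero, the measured rate falls below one spike per neuron, mirroring in spirit the 0.142 spikes-per-neuron sparsity reported above.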