Temporal Dependent Local Learning for Deep Spiking Neural Networks

Chenxiang Ma, Junhai Xu, Qiang Yu
{"title":"深度尖峰神经网络的时间依赖局部学习","authors":"Chenxiang Ma, Junhai Xu, Qiang Yu","doi":"10.1109/IJCNN52387.2021.9534390","DOIUrl":null,"url":null,"abstract":"Spiking neural networks (SNNs) are promising to replicate the efficiency of the brain by utilizing a paradigm of spike-based computation. Training a deep SNN is of great importance for solving practical tasks as well as discovering the fascinating capability of spike-based computation. The biologically plausible scheme of local learning motivates many approaches that enable training deep networks in an efficient parallel way. However, most of the existing spike-based local learning approaches show relatively low performances on challenging tasks. In this paper, we propose a new spike-based temporal dependent local learning (TDLL) algorithm, where each hidden layer of a deep SNN is independently trained with an auxiliary trainable spiking projection layer, and temporal dependency is fully employed to construct local errors for adjusting parameters. We examine the performance of the proposed TDLL with various networks on the MNIST, Fashion-MNIST, SVHN and CIFAR-10 datasets. Experimental results highlight that our method can scale up to larger networks, and more importantly, achieves relatively high accuracies on all benchmarks, which are even competitive with the ones obtained by global backpropagation-based methods. This work therefore contributes to providing an effective and efficient local learning method for deep SNNs, which could greatly benefit the developments of distributed neuromorphic computing.","PeriodicalId":396583,"journal":{"name":"2021 International Joint Conference on Neural Networks (IJCNN)","volume":"305 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Temporal Dependent Local Learning for Deep Spiking Neural Networks\",\"authors\":\"Chenxiang Ma, Junhai Xu, Qiang Yu\",\"doi\":\"10.1109/IJCNN52387.2021.9534390\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Spiking neural networks (SNNs) are promising to replicate the efficiency of the brain by utilizing a paradigm of spike-based computation. Training a deep SNN is of great importance for solving practical tasks as well as discovering the fascinating capability of spike-based computation. The biologically plausible scheme of local learning motivates many approaches that enable training deep networks in an efficient parallel way. However, most of the existing spike-based local learning approaches show relatively low performances on challenging tasks. In this paper, we propose a new spike-based temporal dependent local learning (TDLL) algorithm, where each hidden layer of a deep SNN is independently trained with an auxiliary trainable spiking projection layer, and temporal dependency is fully employed to construct local errors for adjusting parameters. We examine the performance of the proposed TDLL with various networks on the MNIST, Fashion-MNIST, SVHN and CIFAR-10 datasets. Experimental results highlight that our method can scale up to larger networks, and more importantly, achieves relatively high accuracies on all benchmarks, which are even competitive with the ones obtained by global backpropagation-based methods. 
This work therefore contributes to providing an effective and efficient local learning method for deep SNNs, which could greatly benefit the developments of distributed neuromorphic computing.\",\"PeriodicalId\":396583,\"journal\":{\"name\":\"2021 International Joint Conference on Neural Networks (IJCNN)\",\"volume\":\"305 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Joint Conference on Neural Networks (IJCNN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN52387.2021.9534390\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN52387.2021.9534390","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10

Abstract

Spiking neural networks (SNNs) are promising to replicate the efficiency of the brain by utilizing a paradigm of spike-based computation. Training a deep SNN is of great importance for solving practical tasks as well as discovering the fascinating capability of spike-based computation. The biologically plausible scheme of local learning motivates many approaches that enable training deep networks in an efficient parallel way. However, most of the existing spike-based local learning approaches show relatively low performances on challenging tasks. In this paper, we propose a new spike-based temporal dependent local learning (TDLL) algorithm, where each hidden layer of a deep SNN is independently trained with an auxiliary trainable spiking projection layer, and temporal dependency is fully employed to construct local errors for adjusting parameters. We examine the performance of the proposed TDLL with various networks on the MNIST, Fashion-MNIST, SVHN and CIFAR-10 datasets. Experimental results highlight that our method can scale up to larger networks, and more importantly, achieves relatively high accuracies on all benchmarks, which are even competitive with the ones obtained by global backpropagation-based methods. This work therefore contributes to providing an effective and efficient local learning method for deep SNNs, which could greatly benefit the developments of distributed neuromorphic computing.
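The abstract gives enough of the high-level recipe to sketch what layer-local training of a deep SNN can look like in code. The sketch below is a hypothetical PyTorch illustration, not the authors' implementation: the LIF neuron dynamics, the rectangular surrogate gradient, the non-spiking linear readout over spike counts (standing in for the paper's trainable spiking projection layer), and all layer sizes are assumptions. The one load-bearing idea it does demonstrate is detaching each layer's input, so that the error from each auxiliary head stays local to its own layer.

```python
# Hypothetical sketch of layer-local SNN training, loosely inspired by the
# TDLL idea in the abstract: each hidden layer has its own auxiliary
# trainable projection head, and gradients are blocked between layers so
# every layer learns from a purely local error. Names, sizes, and the
# surrogate gradient are illustrative assumptions, not the paper's method.

import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()
    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradient only near the firing threshold.
        return grad_out * (v.abs() < 0.5).float()

spike_fn = SurrogateSpike.apply

class LIFLayer(nn.Module):
    """Fully connected layer of leaky integrate-and-fire neurons."""
    def __init__(self, n_in, n_out, tau=0.9):
        super().__init__()
        self.fc = nn.Linear(n_in, n_out)
        self.tau = tau
    def forward(self, x):                      # x: (batch, time, n_in)
        v, spikes = 0.0, []
        for t in range(x.shape[1]):
            v = self.tau * v + self.fc(x[:, t])  # leaky integration
            s = spike_fn(v - 1.0)                # threshold = 1.0
            v = v - s                            # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes, dim=1)        # (batch, time, n_out)

# Each hidden layer pairs with an auxiliary head mapping its spike train to
# class logits. A plain linear readout over spike counts is used here as a
# simplification of the paper's trainable spiking projection layer.
layers = nn.ModuleList([LIFLayer(784, 256), LIFLayer(256, 256)])
heads = nn.ModuleList([nn.Linear(256, 10), nn.Linear(256, 10)])
opts = [torch.optim.Adam(list(l.parameters()) + list(h.parameters()), lr=1e-3)
        for l, h in zip(layers, heads)]
loss_fn = nn.CrossEntropyLoss()

def local_train_step(x, y):
    """One update in which every layer trains from its own local error."""
    h = x
    for layer, head, opt in zip(layers, heads, opts):
        h = layer(h.detach())            # detach blocks cross-layer backprop
        logits = head(h.sum(dim=1))      # rate decoding: spike count over time
        loss = loss_fn(logits, y)
        opt.zero_grad()
        loss.backward()                  # gradients stay within this layer
        opt.step()
    return loss.item()

# Toy usage: random spike-train inputs of shape (batch, time, features).
x = (torch.rand(8, 20, 784) < 0.1).float()
y = torch.randint(0, 10, (8,))
print(local_train_step(x, y))
```

Because no gradient crosses the detach boundary, each layer-head pair could in principle be updated in parallel as soon as its input spikes are available, which is the efficiency argument the abstract makes for local learning. Note that the real TDLL constructs its local errors from temporal dependencies in the spike trains, which the plain spike-count readout above does not capture.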