Title: Robust Spatiotemporal Prototype Learning for Spiking Neural Networks
Authors: Wuque Cai, Hongze Sun, Qianqian Liao, Jiayi He, Duo Chen, Dezhong Yao, Daqing Guo
DOI: 10.1109/TNNLS.2025.3583747 (https://doi.org/10.1109/TNNLS.2025.3583747)
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. PP, pp. 18995-19009
Published: 2025-10-01 (Journal Article)
Impact Factor: 8.9 | JCR: Q1, Computer Science, Artificial Intelligence
Citations: 0
Abstract
Spiking neural networks (SNNs) leverage their spike-driven nature to achieve high energy efficiency, positioning them as a promising alternative to traditional artificial neural networks (ANNs). The spiking decoder, a crucial component for output, significantly affects the performance of SNNs. However, current rate-coding schemes for SNN decoding often lack robustness and lack a training framework suited to robust learning, while alternatives to rate coding generally yield worse overall performance. To address these challenges, we propose spatiotemporal prototype (STP) learning for SNNs, which uses multiple learnable binarized prototypes for distance-based decoding. In addition, we introduce a cotraining framework that jointly optimizes prototypes and model parameters, enabling mutual adaptation of the two components. STP learning clusters feature centers through supervised learning to ensure effective aggregation around the prototypes, while maintaining enough spacing between prototypes to handle noise and interference. This dual capability results in superior stability and robustness. On eight benchmark datasets with diverse challenges, the STP-SNN model achieves performance comparable or superior to state-of-the-art methods. Notably, STP learning demonstrates exceptional robustness and stability in multitask experiments. Overall, these findings reveal that STP learning is an effective means of improving the performance and robustness of SNNs.
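To make the decoding idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of distance-based decoding with binary class prototypes: the network's real-valued readout (e.g., spike counts or firing rates) is binarized, and each sample is assigned the class of its nearest prototype under the Hamming distance. The function names, threshold, and toy prototypes are illustrative assumptions.

```python
import numpy as np

def binarize(x, threshold=0.5):
    """Binarize real-valued readouts (e.g., firing rates) to {0, 1}.
    The 0.5 threshold is an illustrative choice, not from the paper."""
    return (x >= threshold).astype(np.int64)

def prototype_decode(features, prototypes):
    """Assign each sample to the class of its nearest binary prototype.

    features:   (batch, dim) real-valued readout from the SNN
    prototypes: (classes, dim) binary prototypes, one per class
    returns:    (batch,) predicted class indices
    """
    b = binarize(features)  # (batch, dim) in {0, 1}
    # Hamming distance between every sample and every prototype
    dists = np.abs(b[:, None, :] - prototypes[None, :, :]).sum(axis=-1)
    return dists.argmin(axis=1)

# Toy usage: 2 classes with 4-dimensional prototypes
protos = np.array([[1, 1, 0, 0],
                   [0, 0, 1, 1]])
feats = np.array([[0.9, 0.8, 0.1, 0.2],   # near the class-0 prototype
                  [0.1, 0.3, 0.7, 0.9]])  # near the class-1 prototype
preds = prototype_decode(feats, protos)   # → array([0, 1])
```

In STP learning the prototypes themselves are learnable and cotrained with the network, pushing features toward their class prototype while keeping prototypes well separated; the fixed prototypes above only illustrate the decoding step.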
Journal Introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.