Optimizing Recurrent Spiking Neural Networks with Small Time Constants for Temporal Tasks

Yuan Zeng, Edward Jeffs, T. Stewart, Y. Berdichevsky, Xiaochen Guo

Proceedings of the International Conference on Neuromorphic Systems 2022, July 27, 2022. DOI: 10.1145/3546790.3546796
The recurrent spiking neural network (RSNN) is a frequently studied model, both for understanding biological neural networks and for developing energy-efficient neuromorphic systems. Deep learning optimization approaches, such as backpropagation through time (BPTT) equipped with a surrogate gradient, can serve as efficient optimization methods for RSNNs. Incorporating the dynamic properties of biological neurons into the neuron model may improve a network's temporal learning capability. Earlier work considers only spike frequency adaptation, with a large adaptation time constant that may be unsuitable for neuromorphic implementation. Besides adaptation, the synapse is an important structure for information transfer between neurons, and its dynamics may influence network performance. In this work, a leaky integrate-and-fire (LIF) neuron model with dynamic synapses and spike frequency adaptation is used for temporal tasks. A step-by-step experiment is designed to understand the impact of recurrent connections, the synapse model, and the adaptation model on network accuracy. For each step, a hyperparameter tuning tool is used to find the best set of neuron parameters. In addition, the influence of the synapse and adaptation time constants is studied. Results suggest that dynamic synapses are more efficient than adaptation at improving the network's learning capability. When the adaptation and synapse models are combined, the network achieves accuracy similar to state-of-the-art RSNN work while requiring fewer neurons and smaller time constants.
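To make the abstract's mechanisms concrete, the sketch below shows one way to implement a LIF neuron with a first-order dynamic synapse, spike frequency adaptation, and a surrogate gradient that lets BPTT propagate through the spiking threshold. This is a minimal PyTorch illustration, not the authors' implementation; the names (`SurrogateSpike`, `adaptive_lif_step`) and all parameter values (time constants, threshold, adaptation strength, surrogate slope) are assumptions chosen for readability.

```python
# Minimal sketch (assumed, not the authors' code) of an adaptive LIF
# neuron with a dynamic synapse, trainable with BPTT via a surrogate
# gradient. All constants are illustrative.
import math
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass; a smooth fast-sigmoid
    derivative in the backward pass, so the non-differentiable
    threshold passes gradients during BPTT."""

    @staticmethod
    def forward(ctx, x):  # x = membrane potential minus effective threshold
        ctx.save_for_backward(x)
        return (x > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        slope = 10.0  # assumed surrogate steepness
        return grad_output / (slope * x.abs() + 1.0) ** 2


def adaptive_lif_step(inp, spikes_prev, v, i_syn, a, w_rec,
                      dt=1e-3, tau_mem=20e-3, tau_syn=5e-3,
                      tau_adapt=100e-3, v_th=1.0, beta=0.2):
    """Advance the recurrent layer one time step.
    i_syn: dynamic-synapse current (low-pass filter of input spikes)
    a:     adaptation variable that raises the effective threshold
    The small tau_syn and tau_adapt values reflect the paper's focus
    on small time constants."""
    # Dynamic synapse: exponential filter of external and recurrent input.
    i_syn = i_syn * math.exp(-dt / tau_syn) + inp + spikes_prev @ w_rec
    # Leaky membrane integration.
    v = v * math.exp(-dt / tau_mem) + i_syn * (dt / tau_mem)
    # Spike frequency adaptation decays between spikes...
    a = a * math.exp(-dt / tau_adapt)
    # ...and shifts the threshold upward, so recently active neurons
    # need stronger input to fire again.
    spikes = SurrogateSpike.apply(v - (v_th + beta * a))
    v = v * (1.0 - spikes)  # reset membrane where a spike occurred
    a = a + spikes          # bump adaptation on each spike
    return spikes, v, i_syn, a
```

Unrolling `adaptive_lif_step` over an input sequence and applying a loss to the output spike trains builds a computation graph through which `loss.backward()` performs BPTT, with `SurrogateSpike.backward` supplying the gradient at every threshold crossing.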