{"title":"用时间换空间的方法减小尖峰卷积神经网络的大小","authors":"J. Plank, Jiajia Zhao, Brent Hurst","doi":"10.1109/ICRC2020.2020.00010","DOIUrl":null,"url":null,"abstract":"Spiking neural networks are attractive alternatives to conventional neural networks because of their ability to implement complex algorithms with low power and network complexity. On the flip side, they are difficult to train to solve specific problems. One approach to training is to train conventional neural networks with binary threshold activation functions, which may then be implemented with spikes. This is a powerful approach. However, when applied to neural networks with convolutional kernels, the spiking networks explode in size. In this work, we design multiple spiking computational modules, which reduce the size of the networks back to size of the conventional networks. They do so by taking advantage of the temporal nature of spiking neural networks. We evaluate the size reduction analytically and on classification examples. Finally, we compare and confirm the classification accuracy of their implementation on a discrete threshold neuroprocessor.","PeriodicalId":320580,"journal":{"name":"2020 International Conference on Rebooting Computing (ICRC)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Reducing the Size of Spiking Convolutional Neural Networks by Trading Time for Space\",\"authors\":\"J. Plank, Jiajia Zhao, Brent Hurst\",\"doi\":\"10.1109/ICRC2020.2020.00010\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Spiking neural networks are attractive alternatives to conventional neural networks because of their ability to implement complex algorithms with low power and network complexity. On the flip side, they are difficult to train to solve specific problems. One approach to training is to train conventional neural networks with binary threshold activation functions, which may then be implemented with spikes. This is a powerful approach. However, when applied to neural networks with convolutional kernels, the spiking networks explode in size. In this work, we design multiple spiking computational modules, which reduce the size of the networks back to size of the conventional networks. They do so by taking advantage of the temporal nature of spiking neural networks. We evaluate the size reduction analytically and on classification examples. 
Finally, we compare and confirm the classification accuracy of their implementation on a discrete threshold neuroprocessor.\",\"PeriodicalId\":320580,\"journal\":{\"name\":\"2020 International Conference on Rebooting Computing (ICRC)\",\"volume\":\"6 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 International Conference on Rebooting Computing (ICRC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICRC2020.2020.00010\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Rebooting Computing (ICRC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRC2020.2020.00010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Reducing the Size of Spiking Convolutional Neural Networks by Trading Time for Space
Spiking neural networks are attractive alternatives to conventional neural networks because they can implement complex algorithms with low power and low network complexity. On the flip side, they are difficult to train to solve specific problems. One approach is to train conventional neural networks with binary threshold activation functions, which may then be implemented with spikes. This is a powerful approach; however, when it is applied to neural networks with convolutional kernels, the spiking networks explode in size. In this work, we design multiple spiking computational modules that reduce the size of the networks back to the size of the conventional networks. They do so by taking advantage of the temporal nature of spiking neural networks. We evaluate the size reduction analytically and on classification examples. Finally, we compare and confirm the classification accuracy of the modules' implementation on a discrete-threshold neuroprocessor.
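
Below is a minimal NumPy sketch of the trade-off the abstract describes; it is an illustration under our own assumptions, not the paper's actual modules. The function names, the 8x8 image, the 3x3 kernel, and the threshold value 0.5 are all hypothetical. The "space" version materializes one binary-threshold neuron per output position, each holding its own copy of the kernel weights, which is the size explosion a naive spiking implementation of a convolution suffers; the "time" version reuses a single k-by-k module across timesteps, one output position per step.

import numpy as np

def binary_threshold(x, theta):
    # Binary threshold activation: a unit fires (1) iff its weighted input
    # reaches theta -- the behavior a discrete spiking neuron realizes.
    return (np.asarray(x) >= theta).astype(int)

def conv_spiking_space(image, kernel, theta):
    # Space-heavy version (illustrative, not the paper's construction):
    # one thresholded neuron per output position, each with its own copy
    # of the kernel weights stamped into a full-image weight row.
    H, W = image.shape
    k = kernel.shape[0]
    out_h, out_w = H - k + 1, W - k + 1
    weights = np.zeros((out_h * out_w, H * W))
    for r in range(out_h):
        for c in range(out_w):
            stamp = np.zeros((H, W))
            stamp[r:r + k, c:c + k] = kernel
            weights[r * out_w + c] = stamp.ravel()
    return binary_threshold(weights @ image.ravel(), theta).reshape(out_h, out_w)

def conv_spiking_time(image, kernel, theta):
    # Time-heavy version: a single k*k module is reused, handling one
    # output position per "timestep". Network size stays at k*k weights;
    # the cost moves into the number of timesteps -- time traded for space.
    H, W = image.shape
    k = kernel.shape[0]
    out_h, out_w = H - k + 1, W - k + 1
    out = np.zeros((out_h, out_w), dtype=int)
    t = 0  # timestep counter: one output position evaluated per step
    for r in range(out_h):
        for c in range(out_w):
            patch = image[r:r + k, c:c + k]
            out[r, c] = binary_threshold(np.sum(patch * kernel), theta)
            t += 1
    return out

rng = np.random.default_rng(0)
image = rng.integers(0, 2, size=(8, 8)).astype(float)
kernel = rng.normal(size=(3, 3))

space_out = conv_spiking_space(image, kernel, theta=0.5)
time_out = conv_spiking_time(image, kernel, theta=0.5)
assert np.array_equal(space_out, time_out)  # same outputs, different network sizes

For an H x W input and a k x k kernel, the space version holds (H-k+1)(W-k+1) neurons with k^2 weights each, while the time version holds a single k^2-weight module and instead spends (H-k+1)(W-k+1) timesteps; the final assertion checks that both produce identical outputs, which is the sense in which time is traded for space.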