Compact Probabilistic Poisson Neuron based on Back-Hopping Oscillation in STT-MRAM for All-Spin Deep Spiking Neural Network
Ming-Hung Wu, Ming-Shun Huang, Zhifeng Zhu, Fu-Xiang Liang, Ming-Chun Hong, Jiefang Deng, Jeng-Hua Wei, S. Sheu, Chih-I Wu, G. Liang, T. Hou
2020 IEEE Symposium on VLSI Technology, June 2020, pp. 1-2. DOI: 10.1109/VLSITechnology18217.2020.9265033
Citations: 8
Abstract
A unique compact Poisson neuron that encodes information in the tunable duty cycle of probabilistic spike trains is presented as an enabling technology for cost-effective spiking neural network (SNN) hardware. The Poisson neuron exploits the back-hopping oscillation (BHO) in scalable spin-transfer torque (STT)-MRAM. Macrospin LLGS simulations confirm that coupled local Joule heating and STT effects are responsible for the bias-dependent BHO. The complete neuron circuit design is at least $6\times$ smaller than the state-of-the-art integrate-and-fire (IF) CMOS neuron. Hardware-friendly all-spin deep SNNs achieve accuracy equivalent to deep neural networks (DNNs), 98.4% on MNIST, even when the probabilistic nature of the neurons is taken into account.
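To make the encoding scheme concrete, the following is a minimal Python sketch of duty-cycle (rate) coding with probabilistic spikes, assuming a simple Bernoulli-per-timestep model; the function name and parameters are illustrative and not taken from the paper, which realizes this behavior physically via BHO in an STT-MRAM device rather than in software.

```python
import numpy as np

def poisson_spike_train(duty_cycle, n_steps, rng=None):
    """Generate a probabilistic spike train whose mean duty cycle
    (fraction of time steps carrying a spike) encodes the neuron's
    activation, analogous in spirit to the tunable duty cycle of the
    BHO-based Poisson neuron. Hypothetical illustration only.

    duty_cycle : float in [0, 1], target spiking probability per step
    n_steps    : number of discrete time steps to simulate
    """
    rng = np.random.default_rng() if rng is None else rng
    # Each time step fires independently with probability `duty_cycle`,
    # so the spike count over a window is binomially distributed
    # (approximately Poisson for small duty cycles and long windows).
    return (rng.random(n_steps) < duty_cycle).astype(np.uint8)

# Example: an activation of 0.3 maps to a spike train with ~30% duty cycle.
spikes = poisson_spike_train(duty_cycle=0.3, n_steps=1000)
print("measured duty cycle:", spikes.mean())
```

Under this model, downstream layers read out the activation by averaging spikes over a time window, which is why the reported SNN accuracy can approach that of the equivalent DNN despite the stochastic firing.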