{"title":"基于随机存储器的交叉棒和光神经网络的神经形态计算鲁棒性","authors":"Grace Li Zhang, Bing Li, Ying Zhu, Tianchen Wang, Yiyu Shi, Xunzhao Yin, Cheng Zhuo, Huaxi Gu, Tsung-Yi Ho, Ulf Schlichtmann, Xunzhao, Yin","doi":"10.1145/3394885.3431634","DOIUrl":null,"url":null,"abstract":"RRAM-based crossbars and optical neural networks are attractive platforms to accelerate neuromorphic computing. However, both accelerators suffer from hardware uncertainties such as process variations. These uncertainty issues left unaddressed, the inference accuracy of these computing platforms can degrade significantly. In this paper, a statistical training method where weights under process variations and noise are modeled as statistical random variables is presented. To incorporate these statistical weights into training, the computations in neural networks are modified accordingly. For optical neural networks, we modify the cost function during software training to reduce the effects of process variations and thermal imbalance. In addition, the residual effects of process variations are extracted and calibrated in hardware test, and thermal variations on devices are also compensated in advance. 
Simulation results demonstrate that the inference accuracy can be improved significantly under hardware uncertainties for both platforms.","PeriodicalId":186307,"journal":{"name":"2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC)","volume":"206 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robustness of Neuromorphic Computing with RRAM-based Crossbars and Optical Neural Networks\",\"authors\":\"Grace Li Zhang, Bing Li, Ying Zhu, Tianchen Wang, Yiyu Shi, Xunzhao Yin, Cheng Zhuo, Huaxi Gu, Tsung-Yi Ho, Ulf Schlichtmann, Xunzhao, Yin\",\"doi\":\"10.1145/3394885.3431634\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"RRAM-based crossbars and optical neural networks are attractive platforms to accelerate neuromorphic computing. However, both accelerators suffer from hardware uncertainties such as process variations. These uncertainty issues left unaddressed, the inference accuracy of these computing platforms can degrade significantly. In this paper, a statistical training method where weights under process variations and noise are modeled as statistical random variables is presented. To incorporate these statistical weights into training, the computations in neural networks are modified accordingly. For optical neural networks, we modify the cost function during software training to reduce the effects of process variations and thermal imbalance. In addition, the residual effects of process variations are extracted and calibrated in hardware test, and thermal variations on devices are also compensated in advance. 
Simulation results demonstrate that the inference accuracy can be improved significantly under hardware uncertainties for both platforms.\",\"PeriodicalId\":186307,\"journal\":{\"name\":\"2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC)\",\"volume\":\"206 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3394885.3431634\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 26th Asia and South Pacific Design Automation Conference (ASP-DAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3394885.3431634","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Robustness of Neuromorphic Computing with RRAM-based Crossbars and Optical Neural Networks
RRAM-based crossbars and optical neural networks are attractive platforms for accelerating neuromorphic computing. However, both accelerators suffer from hardware uncertainties such as process variations. If these uncertainty issues are left unaddressed, the inference accuracy of these computing platforms can degrade significantly. In this paper, a statistical training method is presented in which weights subject to process variations and noise are modeled as statistical random variables. To incorporate these statistical weights into training, the computations in neural networks are modified accordingly. For optical neural networks, we modify the cost function during software training to reduce the effects of process variations and thermal imbalance. In addition, the residual effects of process variations are extracted and calibrated during hardware testing, and thermal variations on devices are compensated in advance. Simulation results demonstrate that the inference accuracy of both platforms can be improved significantly under hardware uncertainties.
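The core idea of modeling weights as random variables during training can be illustrated with a minimal sketch. Here, each training step samples a multiplicative perturbation w · (1 + ε) with ε ~ N(0, σ²), loosely mimicking conductance variation in an RRAM crossbar, and gradients flow back to the nominal weights. The toy task, the perturbation model, and all names below are illustrative assumptions, not the paper's actual training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable task: 2-D points, label = sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
sigma = 0.1   # assumed relative std-dev of the weight variation
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # One Monte-Carlo sample of the weight distribution per step;
    # the update targets the nominal weights w, not the noisy copy.
    eps = rng.normal(scale=sigma, size=w.shape)
    w_noisy = w * (1.0 + eps)
    p = sigmoid(X @ w_noisy + b)
    grad_logit = (p - y) / len(y)            # d(cross-entropy)/d(logit)
    # Chain rule through w_noisy = w * (1 + eps):
    w -= lr * (X.T @ grad_logit) * (1.0 + eps)
    b -= lr * grad_logit.sum()

# Evaluate the trained nominal weights under fresh variation samples.
accs = []
for _ in range(20):
    eps = rng.normal(scale=sigma, size=w.shape)
    pred = (X @ (w * (1.0 + eps)) + b) > 0
    accs.append((pred == (y > 0.5)).mean())
mean_acc = float(np.mean(accs))
```

Because the network sees a different variation instance at every step, the learned weights are pushed toward regions of the loss surface that remain accurate when deployed hardware deviates from nominal values.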