Reliability Evaluation of Pruned Neural Networks against Errors on Parameters

Zhen Gao, Xiaohui Wei, Han Zhang, Wenshuo Li, Guangjun Ge, Yu Wang, P. Reviriego

2020 IEEE International Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems (DFT), published 2020-10-19
DOI: 10.1109/DFT50435.2020.9250812
Convolutional Neural Networks (CNNs) are widely used in image classification tasks. To enable the deployment of CNNs on resource-limited embedded systems, pruning is a popular technique for reducing network complexity. In this paper, the robustness of pruned networks against errors on the network parameters is examined, with VGG16 as a case study. The effects of errors on the weights, biases, and batch normalization (BN) parameters are evaluated for networks with different pruning rates using error injection experiments. The results show that, in general, networks with more weights pruned are more robust for a given error rate. The effect of multiple errors on bias or BN parameters is almost the same for networks with pruning rates below 90%. Further experiments explain the bimodal distribution of network performance under parameter errors, finding that errors on only 6% of the parameter bits cause large degradation of neural network performance.
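The error injection experiments described above flip bits in the stored network parameters. As a rough illustration only (the paper does not publish its injection code), the following sketch flips each bit of a float32 parameter tensor independently at a given bit error rate; the function name and the uniform random bit-flip fault model are assumptions, not the authors' exact setup.

```python
import numpy as np

def inject_bit_errors(weights: np.ndarray, bit_error_rate: float,
                      rng: np.random.Generator) -> np.ndarray:
    """Return a copy of `weights` with each bit of the float32
    representation flipped independently with probability
    `bit_error_rate`, mimicking random memory faults."""
    assert weights.dtype == np.float32
    # Reinterpret the parameters as raw 32-bit integers so that
    # individual bits can be toggled with XOR.
    bits = weights.view(np.uint32).copy()
    for bit in range(32):
        # Bernoulli mask selecting which elements get this bit flipped.
        flip = rng.random(bits.shape) < bit_error_rate
        bits[flip] ^= np.uint32(1) << np.uint32(bit)
    return bits.view(np.float32)

# Example: corrupt a mock weight tensor and measure how many
# parameters changed at a bit error rate of 1e-3.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
w_faulty = inject_bit_errors(w, bit_error_rate=1e-3, rng=rng)
```

In an experiment like the paper's, such a routine would be applied to the weights, biases, or BN parameters of the pruned model before running inference on the test set, and the resulting accuracy drop would be recorded per error rate.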