The True Cost of Errors in Emerging Memory Devices: A Worst-Case Analysis of Device Errors in IMC for Safety-Critical Applications

Alptekin Vardar, Li Zhang, Saiyam Bherulal Jain, Shaown Mojumder, N. Laleni, S. De, T. Kämpfe
{"title":"The True Cost of Errors in Emerging Memory Devices: A Worst-Case Analysis of Device Errors in IMC for Safety-Critical Applications","authors":"Alptekin Vardar, Li Zhang, Saiyam Bherulal Jain, Shaown Mojumder, N. Laleni, S. De, T. Kämpfe","doi":"10.1109/SMACD58065.2023.10192126","DOIUrl":null,"url":null,"abstract":"In-memory computing devices are prone to errors that can significantly affect the accuracy of neural network inference. While average accuracy loss is often used to evaluate the impact of such errors, this metric may not be reliable for safety-critical systems where worst-case performance is crucial. In this work, we present a comprehensive statistical analysis of the variability in the accuracy of quantized neural networks. We conduct experiments on two well-known neural network architectures, LeNet-5 and ResNet20, using both 4-bit and 8- bit quantization, and measure the worst-case impact of errors on model accuracy. Our results demonstrate that worst-case variation is much more significant than the impact on average accuracy and that 8-bit quantization is more susceptible to errors. We also investigate the potential of intra-layer mixed error injection to mitigate the effects of errors and show that it can improve the worst-case accuracy of neural networks.","PeriodicalId":239306,"journal":{"name":"2023 19th International Conference on Synthesis, Modeling, Analysis and Simulation Methods and Applications to Circuit Design (SMACD)","volume":"617 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 19th International Conference on Synthesis, Modeling, Analysis and Simulation Methods and Applications to Circuit Design (SMACD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SMACD58065.2023.10192126","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In-memory computing devices are prone to errors that can significantly affect the accuracy of neural network inference. While average accuracy loss is often used to evaluate the impact of such errors, this metric may not be reliable for safety-critical systems, where worst-case performance is crucial. In this work, we present a comprehensive statistical analysis of the variability in the accuracy of quantized neural networks. We conduct experiments on two well-known neural network architectures, LeNet-5 and ResNet20, using both 4-bit and 8-bit quantization, and measure the worst-case impact of errors on model accuracy. Our results demonstrate that the worst-case accuracy degradation is much more significant than the average accuracy loss and that 8-bit quantization is more susceptible to errors. We also investigate the potential of intra-layer mixed error injection to mitigate the effects of errors and show that it can improve the worst-case accuracy of neural networks.
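To illustrate the kind of Monte Carlo error-injection study the abstract describes, the sketch below flips random bits in the quantized weights of a toy linear classifier and reports clean, average, and worst-case accuracy over many trials. Everything here (the synthetic data, the uniform symmetric quantizer, the independent per-bit error model, and the 1% error rate) is an assumption for illustration only; it is not the authors' LeNet-5/ResNet20 experimental setup.

```python
# Minimal sketch (not the paper's code): Monte Carlo bit-flip injection into
# quantized weights of a toy classifier, comparing average vs. worst-case accuracy.
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" linear classifier on synthetic two-class data (illustrative only).
n_features, n_classes, n_test = 32, 2, 2000
X = rng.normal(size=(n_test, n_features))
W_float = rng.normal(size=(n_features, n_classes))
y = np.argmax(X @ W_float, axis=1)          # labels generated by the clean weights

def quantize(w, n_bits):
    """Uniform symmetric quantization to signed n_bits integers."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.max(np.abs(w)) / qmax
    return np.round(w / scale).astype(np.int32), scale

def inject_bit_flips(w_int, n_bits, error_rate, rng):
    """Flip each stored bit independently with probability error_rate."""
    w = w_int.copy()
    for bit in range(n_bits):
        flips = rng.random(w.shape) < error_rate
        w = np.where(flips, w ^ (1 << bit), w)
    # Re-interpret the low n_bits as a signed two's-complement value.
    return ((w + 2 ** (n_bits - 1)) % 2 ** n_bits) - 2 ** (n_bits - 1)

def accuracy(w_int, scale):
    pred = np.argmax(X @ (w_int * scale), axis=1)
    return np.mean(pred == y)

for n_bits in (4, 8):
    W_q, scale = quantize(W_float, n_bits)
    accs = []
    for _ in range(500):                    # 500 Monte Carlo error-injection trials
        W_err = inject_bit_flips(W_q, n_bits, error_rate=0.01, rng=rng)
        accs.append(accuracy(W_err, scale))
    print(f"{n_bits}-bit: clean={accuracy(W_q, scale):.3f} "
          f"mean={np.mean(accs):.3f} worst={np.min(accs):.3f}")
```

Reporting the minimum over trials, rather than only the mean, is what distinguishes a worst-case analysis of this kind from the usual average-accuracy evaluation.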