A Low-cost Residue-based Scheme for Error-resiliency of RNN Accelerators

Nooshin Nosrati, Z. Navabi
DOI: 10.1109/DDECS57882.2023.10139388
Published in: 2023 26th International Symposium on Design and Diagnostics of Electronic Circuits and Systems (DDECS)
Publication date: 2023-05-03

Abstract

Acceleration and power reduction requirements are usually the main constraints in the design of Artificial Neural Network (ANN) accelerators. However, in safety-critical applications like autonomous driving, reliability takes precedence over other requirements. Although ANN algorithms provide a degree of inherent resiliency, the hardware is still vulnerable to faults that may cause catastrophic failures. This paper proposes using residue codes to detect soft errors in Recurrent Neural Networks (RNNs), and in particular Long Short-Term Memory (LSTM) networks. We attach Concurrent Error Detection (CED) hardware units either to an entire LSTM structure or to its substructures. Depending on the granularity of the components to which they are applied, CEDs are referred to as coarse-grain or fine-grain. The simulation results show that fine-grain CEDs outperform coarse-grain CEDs in both fault detection rate and misprediction coverage. Specifically, fine-grain residue-based CEDs provide up to 97% fault detection at extremely high bit error rates (10⁻²). Moreover, they reduce the misprediction rate by 84% compared to an unprotected LSTM.
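To illustrate the general idea behind residue-based concurrent error detection, here is a minimal software sketch of a mod-3 residue check around a multiply-accumulate (MAC) operation, the core arithmetic in LSTM gates. This is not the paper's hardware design; the modulus, function names, and fault model are illustrative assumptions.

```python
M = 3  # check modulus; mod-3 is a common low-cost choice for residue CED

def residue(x: int) -> int:
    """Residue of x with respect to the check modulus."""
    return x % M

def mac_with_ced(a: int, b: int, acc: int, inject_fault: bool = False):
    """Compute acc + a*b and concurrently check it via residue arithmetic.

    The predicted residue is derived from the narrow residues of the
    operands, mirroring how a small checker circuit would shadow the
    wide MAC datapath. `inject_fault` models a single-bit soft error.
    """
    result = acc + a * b
    if inject_fault:
        result ^= 1 << 4  # flip one bit of the result (soft-error model)
    # Residue arithmetic is homomorphic: |acc + a*b| = ||acc| + |a|*|b||
    predicted = (residue(acc) + residue(a) * residue(b)) % M
    error_detected = residue(result) != predicted
    return result, error_detected
```

A mod-3 checker like this misses only errors that change the result by a multiple of 3 (residue aliasing), which is why detection coverage is high but not 100%, consistent with the up-to-97% figure reported in the abstract.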