An RRAM retention prediction framework using a convolutional neural network based on relaxation behavior

Yibei Zhang, Qingtian Zhang, Qi Qin, Wenbin Zhang, Yue Xi, Zhixing Jiang, Jianshi Tang, B. Gao, H. Qian, Huaqiang Wu
{"title":"An RRAM retention prediction framework using a convolutional neural network based on relaxation behavior","authors":"Yibei Zhang, Qingtian Zhang, Qi Qin, Wenbin Zhang, Yue Xi, Zhixing Jiang, Jianshi Tang, B. Gao, H. Qian, Huaqiang Wu","doi":"10.1088/2634-4386/acb965","DOIUrl":null,"url":null,"abstract":"The long-time retention issue of resistive random access memory (RRAM) brings a great challenge in the performance maintenance of large-scale RRAM-based computation-in-memory (CIM) systems. The periodic update is a feasible method to compensate for the accuracy loss caused by retention degradation, especially in demanding high-accuracy applications. In this paper, we propose a selective refresh strategy to reduce the updating cost by predicting the devices’ retention behavior. A convolutional neural network-based retention prediction framework is developed. The framework can determine whether an RRAM device has poor retention that needs to be updated according to its short-time relaxation behavior. By reprogramming these few selected devices, the method can recover the accuracy of the RRAM-based CIM system effectively. 
This work provides a valuable retention coping strategy with low time and energy costs and new insights for analyzing the physical connection between the relaxation and retention behavior of the RRAM device.","PeriodicalId":198030,"journal":{"name":"Neuromorphic Computing and Engineering","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neuromorphic Computing and Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2634-4386/acb965","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The long-time retention issue of resistive random access memory (RRAM) poses a great challenge to maintaining the performance of large-scale RRAM-based computation-in-memory (CIM) systems. Periodic updating is a feasible way to compensate for the accuracy loss caused by retention degradation, especially in applications demanding high accuracy. In this paper, we propose a selective refresh strategy that reduces the updating cost by predicting the devices' retention behavior. A convolutional neural network-based retention prediction framework is developed that determines, from a device's short-time relaxation behavior, whether it has poor retention and needs to be updated. By reprogramming only these few selected devices, the method effectively recovers the accuracy of the RRAM-based CIM system. This work provides a valuable retention coping strategy with low time and energy costs, as well as new insights for analyzing the physical connection between the relaxation and retention behavior of RRAM devices.