Quarry: Quantization-based ADC Reduction for ReRAM-based Deep Neural Network Accelerators

Azat Azamat, Faaiz Asim, Jongeun Lee
{"title":"Quarry: Quantization-based ADC Reduction for ReRAM-based Deep Neural Network Accelerators","authors":"Azat Azamat, Faaiz Asim, Jongeun Lee","doi":"10.1109/ICCAD51958.2021.9643502","DOIUrl":null,"url":null,"abstract":"ReRAM (Resistive Random-Access Memory) crossbar arrays have the potential to provide extremely fast and low-cost DNN (Deep Neural Network) acceleration. However, peripheral circuits, in particular ADCs (Analog-Digital Converters), can be a large overhead and/or slow down the operation considerably. In this paper we propose to use advanced quantization techniques to reduce the ADC overhead of ReRAM crossbar arrays. Our method does not require any hardware change but can reduce the overhead of ADC greatly. Our methodology is also general, having no restriction in terms of DNN type (binarized or multi-bit) or ReRAM crossbar array size. Our experimental results using ResNet on ImageNet dataset demonstrate that our method can reduce the size of ADC by 32× compared with ISAAC at very little accuracy loss of 0.24%p.","PeriodicalId":370791,"journal":{"name":"2021 IEEE/ACM International Conference On Computer Aided Design (ICCAD)","volume":"520 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE/ACM International Conference On Computer Aided Design (ICCAD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCAD51958.2021.9643502","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 10

Abstract

ReRAM (Resistive Random-Access Memory) crossbar arrays have the potential to provide extremely fast and low-cost DNN (Deep Neural Network) acceleration. However, peripheral circuits, in particular ADCs (Analog-to-Digital Converters), can impose a large overhead and/or slow down operation considerably. In this paper, we propose to use advanced quantization techniques to reduce the ADC overhead of ReRAM crossbar arrays. Our method does not require any hardware change, yet can greatly reduce the ADC overhead. Our methodology is also general, with no restriction on DNN type (binarized or multi-bit) or ReRAM crossbar array size. Our experimental results using ResNet on the ImageNet dataset demonstrate that our method can reduce the ADC size by 32× compared with ISAAC, with a very small accuracy loss of 0.24%p.
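To make the 32× figure concrete, the sketch below (not from the paper itself) illustrates how the required ADC resolution scales with crossbar parameters and how trimming output precision translates into ADC size. The `adc_bits_required` helper, the ISAAC-style parameters (128 rows, 1-bit input streaming, 2-bit cells), and the "roughly 2× area per extra bit" rule of thumb for flash-style ADCs are illustrative assumptions, not the paper's method.

```python
import math

def adc_bits_required(rows: int, input_bits_per_cycle: int, cell_bits: int) -> int:
    """Worst-case ADC resolution (bits) to read out one crossbar column exactly.

    With `rows` wordlines active, inputs streamed `input_bits_per_cycle` bits
    per cycle, and each cell storing `cell_bits` bits, a column's partial sum
    can reach rows * (2**input_bits_per_cycle - 1) * (2**cell_bits - 1).
    """
    max_sum = rows * (2**input_bits_per_cycle - 1) * (2**cell_bits - 1)
    return math.ceil(math.log2(max_sum + 1))

# ISAAC-style baseline (assumed): 128-row arrays, 1-bit input streaming, 2-bit cells.
baseline_bits = adc_bits_required(rows=128, input_bits_per_cycle=1, cell_bits=2)
print(baseline_bits)  # -> 9 bits worst case; ISAAC reports 8-bit ADCs via an encoding trick

# For many ADC designs (e.g., flash), area/power grows roughly 2x per extra bit,
# so quantizing away k bits of partial-sum precision shrinks the ADC by about 2**k.
# Under that rough model, k = 5 would correspond to the ~32x reduction cited above.
for k in range(1, 6):
    print(f"drop {k} bit(s) -> ~{2**k}x smaller ADC")
```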