An energy-efficient and high-throughput bitwise CNN on sneak-path-free digital ReRAM crossbar

Leibin Ni, Zichuan Liu, Wenhao Song, J. Yang, Hao Yu, Kanwen Wang, Yuangang Wang
{"title":"An energy-efficient and high-throughput bitwise CNN on sneak-path-free digital ReRAM crossbar","authors":"Leibin Ni, Zichuan Liu, Wenhao Song, J. Yang, Hao Yu, Kanwen Wang, Yuangang Wang","doi":"10.1109/ISLPED.2017.8009177","DOIUrl":null,"url":null,"abstract":"Convolutional neural network (CNN) based machine learning requires a highly parallel as well as low power consumption (including leakage power) hardware accelerator. In this paper, we will present a digital ReRAM crossbar based CNN accelerator that can achieve significantly higher throughput and lower power consumption than state-of-arts. The CNN is trained with binary constraints on both weights and activations such that all operations become bitwise. With further use of 1-bit comparator, the bitwise CNN model can be naturally realized on a digital ReRAM-crossbar device. A novel sneak-path-free ReRAM-crossbar is further utilized for large-scale realization. Simulation experiments show that the bitwise CNN accelerator on the digital ReRAM crossbar achieves 98.3% and 91.4% accuracy on MNIST and CIFAR-10 benchmarks, respectively. 
Moreover, it has a peak throughput of 792GOPS at the power consumption of 6.3mW, which is 18.86 times higher throughput and 44.1 times lower power than CMOS CNN (non-binary) accelerators.","PeriodicalId":385714,"journal":{"name":"2017 IEEE/ACM International Symposium on Low Power Electronics and Design (ISLPED)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"22","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE/ACM International Symposium on Low Power Electronics and Design (ISLPED)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISLPED.2017.8009177","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 22

Abstract

Convolutional neural network (CNN) based machine learning requires a highly parallel, low-power (including leakage power) hardware accelerator. In this paper, we present a digital ReRAM-crossbar-based CNN accelerator that achieves significantly higher throughput and lower power consumption than state-of-the-art designs. The CNN is trained with binary constraints on both weights and activations so that all operations become bitwise. With the further use of a 1-bit comparator, the bitwise CNN model can be naturally realized on a digital ReRAM-crossbar device. A novel sneak-path-free ReRAM crossbar is further utilized for large-scale realization. Simulation experiments show that the bitwise CNN accelerator on the digital ReRAM crossbar achieves 98.3% and 91.4% accuracy on the MNIST and CIFAR-10 benchmarks, respectively. Moreover, it has a peak throughput of 792 GOPS at a power consumption of 6.3 mW, which is 18.86 times higher throughput and 44.1 times lower power than CMOS (non-binary) CNN accelerators.
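To illustrate why binary constraints make all operations bitwise, the following is a minimal sketch (not the paper's implementation; all function names are hypothetical) of a binarized dot product computed as XNOR plus popcount, followed by the 1-bit comparator acting as the activation. Weights and activations in {-1, +1} are packed into integer bit masks (bit 1 encodes +1, bit 0 encodes -1), so each multiply-accumulate collapses to bit operations that a digital ReRAM crossbar can realize.

```python
def binarize(values):
    """Pack a list of +/-1 values into an integer bit mask (1 -> +1, 0 -> -1)."""
    mask = 0
    for i, v in enumerate(values):
        if v > 0:
            mask |= 1 << i
    return mask

def xnor_popcount_dot(w_bits, x_bits, n):
    """Dot product of two {-1, +1} vectors of length n via XNOR + popcount.
    Each matching bit pair contributes +1 and each mismatch -1,
    so dot = 2 * (number of matches) - n."""
    matches = bin(~(w_bits ^ x_bits) & ((1 << n) - 1)).count("1")
    return 2 * matches - n

# Example: w = [+1,-1,+1,+1], x = [+1,+1,-1,+1]
# dot = (+1)(+1) + (-1)(+1) + (+1)(-1) + (+1)(+1) = 0
w = binarize([+1, -1, +1, +1])
x = binarize([+1, +1, -1, +1])
acc = xnor_popcount_dot(w, x, 4)
# The 1-bit comparator then re-binarizes the accumulated sum for the next layer:
out = +1 if acc >= 0 else -1
```

In hardware, the popcount and comparison correspond to counting active crossbar outputs and thresholding them, which is why a sneak-path-free digital crossbar suffices without analog precision.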