Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware

Yuhan Shi, Sangheon Oh, Jaeseoung Park, J. D. Valle, Pavel Salev, Ivan K. Schuller, D. Kuzum
{"title":"Ag-CBRAM交叉棒和Mott ReLU神经元的集成,在硬件上有效实现深度神经网络","authors":"Yuhan Shi, Sangheon Oh, Jaeseoung Park, J. D. Valle, Pavel Salev, Ivan K. Schuller, D. Kuzum","doi":"10.1088/2634-4386/aceea9","DOIUrl":null,"url":null,"abstract":"In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications. However, activation function calculations are still being implemented with general processors or large and complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott rectified linear unit (ReLU) activation neurons for scalable, energy and area-efficient hardware (HW) implementation of deep neural networks. We develop Ag-CBRAM devices that can achieve a high ON/OFF ratio and multi-level programmability. Compact and energy-efficient Mott ReLU neuron devices implementing ReLU activation function are directly connected to the columns of Ag-CBRAM crossbars to compute the output from the weighted sum current. We implement convolution filters and activations for VGG-16 using our integrated HW and demonstrate the successful generation of feature maps for CIFAR-10 images in HW. Our approach paves a new way toward building a highly compact and energy-efficient eNVMs-based in-memory computing system.","PeriodicalId":198030,"journal":{"name":"Neuromorphic Computing and Engineering","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware\",\"authors\":\"Yuhan Shi, Sangheon Oh, Jaeseoung Park, J. D. Valle, Pavel Salev, Ivan K. Schuller, D. Kuzum\",\"doi\":\"10.1088/2634-4386/aceea9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications. However, activation function calculations are still being implemented with general processors or large and complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott rectified linear unit (ReLU) activation neurons for scalable, energy and area-efficient hardware (HW) implementation of deep neural networks. We develop Ag-CBRAM devices that can achieve a high ON/OFF ratio and multi-level programmability. Compact and energy-efficient Mott ReLU neuron devices implementing ReLU activation function are directly connected to the columns of Ag-CBRAM crossbars to compute the output from the weighted sum current. We implement convolution filters and activations for VGG-16 using our integrated HW and demonstrate the successful generation of feature maps for CIFAR-10 images in HW. 
Our approach paves a new way toward building a highly compact and energy-efficient eNVMs-based in-memory computing system.\",\"PeriodicalId\":198030,\"journal\":{\"name\":\"Neuromorphic Computing and Engineering\",\"volume\":\"31 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-08-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neuromorphic Computing and Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1088/2634-4386/aceea9\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neuromorphic Computing and Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2634-4386/aceea9","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications. However, activation function calculations are still implemented with general-purpose processors or large, complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott rectified linear unit (ReLU) activation neurons for scalable, energy- and area-efficient hardware (HW) implementation of deep neural networks. We develop Ag-CBRAM devices that achieve a high ON/OFF ratio and multi-level programmability. Compact, energy-efficient Mott ReLU neuron devices implementing the ReLU activation function are directly connected to the columns of the Ag-CBRAM crossbars to compute the output from the weighted-sum current. We implement convolution filters and activations for VGG-16 using our integrated HW and demonstrate the successful generation of feature maps for CIFAR-10 images in HW. Our approach paves the way toward building highly compact and energy-efficient eNVM-based in-memory computing systems.
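As a rough illustration of the dataflow the abstract describes — conductance-encoded weights producing a weighted-sum column current that a ReLU neuron then rectifies — the following NumPy sketch simulates one crossbar layer. All device parameters (G_OFF, G_ON, N_LEVELS) and the differential weight-to-conductance mapping are assumptions made for illustration only; they are not values or methods reported in the paper.

```python
import numpy as np

# Illustrative device parameters (assumed for this sketch, not reported values):
G_OFF, G_ON = 1e-9, 1e-6   # conductances in siemens; ~10^3 ON/OFF ratio
N_LEVELS = 8               # assumed number of programmable conductance states

def quantize(g):
    """Snap conductances to the device's discrete programmable levels."""
    step = (G_ON - G_OFF) / (N_LEVELS - 1)
    return G_OFF + np.round((g - G_OFF) / step) * step

def weights_to_conductances(w):
    """Map signed weights onto a differential pair of crossbar columns."""
    scale = (G_ON - G_OFF) / max(np.max(np.abs(w)), 1e-12)
    g_pos = quantize(G_OFF + np.clip(w, 0, None) * scale)
    g_neg = quantize(G_OFF + np.clip(-w, 0, None) * scale)
    return g_pos, g_neg

def crossbar_relu(v_in, w):
    """Weighted-sum column currents (Ohm's and Kirchhoff's laws),
    followed by an ideal ReLU transfer function at each column output."""
    g_pos, g_neg = weights_to_conductances(w)
    i_col = v_in @ g_pos - v_in @ g_neg   # differential column currents (A)
    return np.maximum(i_col, 0.0)         # ReLU: pass positive currents only

# Example: a 4-input, 3-neuron layer driven by small read voltages.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))
voltages = rng.uniform(0.0, 0.2, size=4)  # read voltages (V)
print(crossbar_relu(voltages, weights))
```

Note that in the actual hardware the rectification comes from the Mott neuron device's own transfer characteristic rather than a digital max(0, ·); the sketch only mirrors the ideal ReLU behavior that the column-connected neuron is meant to implement.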