Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing

IF 4.1 Q2 MATERIALS SCIENCE, MULTIDISCIPLINARY
Samuel Liu, T. Xiao, J. Kwon, B. Debusschere, S. Agarwal, J. Incorvia, C. Bennett
DOI: 10.3389/fnano.2022.1021943
Journal: Frontiers in Nanotechnology
Published: 2022-10-17 (Journal Article)
Citations: 9

Abstract

Bayesian neural networks (BNNs) combine the generalizability of deep neural networks (DNNs) with a rigorous quantification of predictive uncertainty, which mitigates overfitting and makes them valuable for high-reliability or safety-critical applications. However, the probabilistic nature of BNNs makes them more computationally intensive on digital hardware and so far, less directly amenable to acceleration by analog in-memory computing as compared to DNNs. This work exploits a novel spintronic bit cell that efficiently and compactly implements Gaussian-distributed BNN values. Specifically, the bit cell combines a tunable stochastic magnetic tunnel junction (MTJ) encoding the trained standard deviation and a multi-bit domain-wall MTJ device independently encoding the trained mean. The two devices can be integrated within the same array, enabling highly efficient, fully analog, probabilistic matrix-vector multiplications. We use micromagnetics simulations as the basis of a system-level model of the spintronic BNN accelerator, demonstrating that our design yields accurate, well-calibrated uncertainty estimates for both classification and regression problems and matches software BNN performance. This result paves the way to spintronic in-memory computing systems implementing trusted neural networks at a modest energy budget.
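The core operation the abstract describes is a probabilistic matrix-vector multiply in which each weight is drawn from a Gaussian whose mean is stored in a multi-bit domain-wall MTJ and whose standard deviation is set by a tunable stochastic MTJ. A minimal software sketch of that behavior, with Monte Carlo sampling to recover a predictive mean and uncertainty, might look like the following (all names, shapes, and parameter values here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def probabilistic_mvm(mu, sigma, x, rng):
    """One stochastic 'read': sample W ~ N(mu, sigma^2) elementwise, return W @ x.

    In the hardware analogy, mu plays the role of the domain-wall MTJ
    conductances and sigma the tunable stochastic-MTJ spread (assumed model).
    """
    w = mu + sigma * rng.standard_normal(mu.shape)
    return w @ x

def bnn_layer_predict(mu, sigma, x, n_samples, rng):
    """Monte Carlo estimate of the layer's output mean and its uncertainty."""
    outs = np.stack([probabilistic_mvm(mu, sigma, x, rng)
                     for _ in range(n_samples)])
    return outs.mean(axis=0), outs.std(axis=0)

# Toy layer: 3 outputs, 4 inputs, uniform trained standard deviation of 0.1.
mu = rng.normal(size=(3, 4))
sigma = 0.1 * np.ones((3, 4))
x = rng.normal(size=4)
mean_out, std_out = bnn_layer_predict(mu, sigma, x, n_samples=1000, rng=rng)
```

With enough samples, `mean_out` approaches the deterministic product `mu @ x`, while `std_out` reflects the trained per-weight spread propagated through the input; in the proposed accelerator, each sample would instead come from a single analog array read.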
Source journal: Frontiers in Nanotechnology (Engineering: Electrical and Electronic Engineering)
CiteScore: 7.10
Self-citation rate: 0.00%
Articles per year: 96
Review time: 13 weeks