Neuromorphic Hebbian learning with magnetic tunnel junction synapses.

Peng Zhou, Alexander J Edwards, Frederick B Mancoff, Sanjeev Aggarwal, Stephen K Heinrich-Barna, Joseph S Friedman
DOI: 10.1038/s44172-025-00479-2
Journal: Communications Engineering, vol. 4, no. 1, p. 142
Published: 2025-08-04 (Journal Article)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12322246/pdf/
Citations: 0

Abstract


Neuromorphic computing aims to mimic both the function and structure of biological neural networks to provide artificial intelligence with extreme efficiency. Conventional approaches store synaptic weights in non-volatile memory devices with analog resistance states, permitting in-memory computation of neural network operations while avoiding the costs of transferring synaptic weights from memory. However, the use of analog resistance states for storing weights in neuromorphic systems is impeded by stochastic writing, weights drifting over time through stochastic processes, and limited endurance that reduces the precision of synapse weights. Here we propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs), while leveraging the analog nature of their stochastic spin-transfer torque (STT) switching for unsupervised Hebbian learning. We performed an experimental demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning. We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with stochastic STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition. By appropriately applying neuromorphic principles through hardware-aware design, the proposed STT-MTJ neuromorphic learning networks provide a pathway toward artificial intelligence hardware that learns autonomously with extreme efficiency.
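The abstract's central idea is that a single write pulse flips an MTJ between its two resistance states only with some probability, so repeated spike pairings accumulate into an effectively analog weight even though each device is binary. The sketch below illustrates that mechanism in NumPy; the exponential STDP window, the time constant, the peak switching probability, and the use of a device pool per synapse are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (assumed for illustration, not from the paper):
TAU_STDP = 20.0  # ms, width of the assumed exponential STDP window
P_MAX = 0.5      # assumed peak per-pulse MTJ switching probability

def switch_probability(dt_ms):
    """Per-pulse switching probability given spike lag dt = t_post - t_pre.
    An exponential STDP window is assumed: closer spike pairs switch more often."""
    return P_MAX * np.exp(-abs(dt_ms) / TAU_STDP)

def stdp_update(states, dt_ms, rng):
    """Apply one stochastic STT write event to binary MTJs (0 = AP, 1 = P).
    Causal pairing (dt >= 0) potentiates (0 -> 1); anti-causal depresses (1 -> 0)."""
    flips = rng.random(states.shape) < switch_probability(dt_ms)
    if dt_ms >= 0:
        return np.where(flips, 1, states)  # potentiate toward low resistance
    return np.where(flips, 0, states)      # depress toward high resistance

# Treat one synaptic weight as the mean state of a pool of binary devices:
# each device stays binary, but the ensemble average evolves analog-like.
mtj_pool = np.zeros(1000, dtype=int)
for _ in range(5):
    mtj_pool = stdp_update(mtj_pool, dt_ms=5.0, rng=rng)
weight = mtj_pool.mean()
print(f"effective analog weight after 5 causal pairings: {weight:.2f}")
```

Because each pairing only flips a fraction of the still-unswitched devices, the ensemble weight rises gradually and saturates, mimicking an analog potentiation curve from purely binary storage.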
