Decoupling Strategy to Separate Training and Inference with Three-Dimensional Neuromorphic Hardware Composed of Neurons and Hybrid Synapses

IF 15.8 | Tier 1 (Materials Science) | Q1 CHEMISTRY, MULTIDISCIPLINARY
ACS Nano | Pub Date: 2025-03-26 | DOI: 10.1021/acsnano.4c18145
Jung-Woo Lee, See-On Park, Seong-Yun Yun, Yeeun Kim, Hyun Myung, Shinhyun Choi, Yang-Kyu Choi
{"title":"Decoupling Strategy to Separate Training and Inference with Three-Dimensional Neuromorphic Hardware Composed of Neurons and Hybrid Synapses","authors":"Jung-Woo Lee, See-On Park, Seong-Yun Yun, Yeeun Kim, Hyun Myung, Shinhyun Choi, Yang-Kyu Choi","doi":"10.1021/acsnano.4c18145","DOIUrl":null,"url":null,"abstract":"Monolithic 3D integration of neuron and synapse devices is considered a promising solution for energy-efficient and compact neuromorphic hardware. However, achieving optimal performance in both training and inference remains challenging as these processes require different synapse devices with reliable endurance and long retention. Here, we introduce a decoupling strategy to separate training and inference using monolithically integrated neuromorphic hardware with layer-by-layer fabrication. This 3D neuromorphic hardware includes neurons consisting of a single transistor (1T-neuron) in the first layer, long-term operational synapses composed of a single thin-film transistor with a SONOS structure (1TFT-synapses) in the second layer for inference, and durable synapses composed of a memristor (1M-synapses) in the third layer for training. A 1TFT-synapse, utilizing a charge-trap layer, exhibits long retention properties favorable for inference tasks. In contrast, a 1M-synapse, leveraging anion movement at the interface, demonstrates robust endurance for repetitive weight updates during training. With the proposed hybrid synapse architecture, frequent training can be performed using the 1M-synapses with robust endurance, while intermittent inference can be managed using the 1TFT-synapses with long-term retention. This decoupling of synaptic functions is advantageous for achieving a reliable spiking neural network (SNN) in neuromorphic computing.","PeriodicalId":21,"journal":{"name":"ACS Nano","volume":"35 1","pages":""},"PeriodicalIF":15.8000,"publicationDate":"2025-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Nano","FirstCategoryId":"88","ListUrlMain":"https://doi.org/10.1021/acsnano.4c18145","RegionNum":1,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Monolithic 3D integration of neuron and synapse devices is considered a promising solution for energy-efficient and compact neuromorphic hardware. However, achieving optimal performance in both training and inference remains challenging as these processes require different synapse devices with reliable endurance and long retention. Here, we introduce a decoupling strategy to separate training and inference using monolithically integrated neuromorphic hardware with layer-by-layer fabrication. This 3D neuromorphic hardware includes neurons consisting of a single transistor (1T-neuron) in the first layer, long-term operational synapses composed of a single thin-film transistor with a SONOS structure (1TFT-synapses) in the second layer for inference, and durable synapses composed of a memristor (1M-synapses) in the third layer for training. A 1TFT-synapse, utilizing a charge-trap layer, exhibits long retention properties favorable for inference tasks. In contrast, a 1M-synapse, leveraging anion movement at the interface, demonstrates robust endurance for repetitive weight updates during training. With the proposed hybrid synapse architecture, frequent training can be performed using the 1M-synapses with robust endurance, while intermittent inference can be managed using the 1TFT-synapses with long-term retention. This decoupling of synaptic functions is advantageous for achieving a reliable spiking neural network (SNN) in neuromorphic computing.

[Graphical abstract]
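
The following is a minimal conceptual sketch (not the authors' implementation) of the decoupling strategy described in the abstract: weights are trained with many repeated updates on an array standing in for the endurance-robust 1M-synapses, then transferred once to a second array standing in for the long-retention 1TFT-synapses, which alone serves intermittent inference. The toy regression task, the 16-level quantization, and all variable names are illustrative assumptions.

```python
# Conceptual sketch of the train/infer decoupling idea; device parameters are
# illustrative assumptions, not values reported in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn a linear mapping y = X @ w_true.
X = rng.normal(size=(200, 8))
w_true = rng.normal(size=8)
y = X @ w_true

# "Training array" (models 1M-synapses): tolerates many small weight updates.
w_train = np.zeros(8)
lr = 0.05
for epoch in range(100):                      # frequent, repetitive updates
    grad = X.T @ (X @ w_train - y) / len(X)   # mean-squared-error gradient
    w_train -= lr * grad                      # each step is one update event

# One-time transfer: program learned weights into the "inference array"
# (models 1TFT-synapses). Quantization stands in for the finite conductance
# levels of a charge-trap device; 16 levels is an assumption.
levels = np.linspace(w_train.min(), w_train.max(), 16)
w_infer = levels[np.abs(w_train[:, None] - levels[None, :]).argmin(axis=1)]

# Intermittent inference reads only the long-retention array; no further updates.
y_hat = X @ w_infer
print("inference MSE after transfer:", float(np.mean((y_hat - y) ** 2)))
```

In this sketch the training array absorbs every weight-update event, while the inference array is written only once after training, mirroring the endurance-versus-retention split that the hybrid synapse stack is designed to exploit.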

Source Journal
ACS Nano (Engineering & Technology - Materials Science: Multidisciplinary)
CiteScore: 26.00
Self-citation rate: 4.10%
Articles published: 1627
Average review time: 1.7 months
About the journal: ACS Nano, published monthly, serves as an international forum for comprehensive articles on nanoscience and nanotechnology research at the intersections of chemistry, biology, materials science, physics, and engineering. The journal fosters communication among scientists in these communities, facilitating collaboration, new research opportunities, and advancements through discoveries. ACS Nano covers synthesis, assembly, characterization, theory, and simulation of nanostructures, nanobiotechnology, nanofabrication, methods and tools for nanoscience and nanotechnology, and self- and directed-assembly. Alongside original research articles, it offers thorough reviews, perspectives on cutting-edge research, and discussions envisioning the future of nanoscience and nanotechnology.