Highly Tunable Synaptic Modulation in Photo-Activated Remote Charge Trap Memory for Hardware-Based Fault-Tolerant Learning.

IF 26.8 · CAS Tier 1 (Materials Science) · JCR Q1 (Chemistry, Multidisciplinary)
Je-Jun Lee, Hojin Choi, Ju-Hee Lee, Jiwon Moon, Taehyuk Jang, Byoung-Soo Yu, Sang Yeon Kim, Jeong-Ick Cho, Seong-Jun Han, Hyung-Jun Kim, Do Kyung Hwang, Seyong Oh, Jin-Hong Park
Journal: Advanced Materials, vol. 39, no. 1, pp. e15140
DOI: 10.1002/adma.202515140 (https://doi.org/10.1002/adma.202515140)
Published: 2025-10-03 · Journal Article · Impact Factor: 26.8 · JCR Q1 (Chemistry, Multidisciplinary)
Citations: 0

Abstract

The rapid expansion of deep learning applications for unstructured data analysis has led to a substantial increase in energy consumption. This increase is primarily due to matrix-vector multiplication operations, which dominate the energy usage during inference. Although in-memory computing technologies have alleviated some inefficiencies caused by parallel computing, they still face challenges with broader computational algorithms required for advanced deep learning models. In real-world data collection scenarios, datasets often contain "noisy labels" (errors in annotations), which cause recognition inefficiencies in conventional in-memory computing. Here, a hardware-based fault-tolerant learning algorithm designed for artificial synapses with tunable synaptic operation is proposed. In this scheme, the devices simultaneously process both learning and regulatory signals, enabling selective attenuation of weight updates induced by mistraining signals. Utilizing a high synaptic tunability ratio of 4380 realized in photo-activated remote charge trap memory devices based on defect-engineered hexagonal boron nitride (h-BN), the system nearly completely suppresses weight update signals from mislabeled data, which leads to improved recognition accuracy on the mislabeled Modified National Institute of Standards and Technology (MNIST) dataset. These results demonstrate that tunable synaptic devices can enhance training efficiency in in-memory computing systems for mislabeled datasets, thereby reducing the need for extensive data cleansing and preparation.
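The gating idea in the abstract — a regulatory signal that selectively attenuates weight updates from suspect (mislabeled) samples, with the attenuation depth set by the device's tunability ratio — can be sketched in software. The following is a minimal, hypothetical illustration only (not the paper's actual device model or algorithm): a perceptron-style update in which a per-sample regulatory signal scales the learning signal, and the reported tunability ratio of 4380 sets how strongly a distrusted update is suppressed.

```python
import numpy as np

# Assumed, illustrative constant: the abstract's synaptic tunability ratio.
# A "regulated" (distrusted) update is attenuated by this factor rather
# than being applied at full strength.
TUNABILITY_RATIO = 4380.0

def gated_update(w, x, error, regulatory, lr=0.1):
    """Apply a weight update attenuated by a regulatory signal.

    regulatory = 1.0 -> trusted label: full-strength update
    regulatory = 0.0 -> suspected noisy label: update scaled by 1/4380
    """
    gain = regulatory + (1.0 - regulatory) / TUNABILITY_RATIO
    return w + lr * gain * error * x

w = np.zeros(4)
x = np.ones(4)
w_clean = gated_update(w, x, error=1.0, regulatory=1.0)  # trusted sample
w_noisy = gated_update(w, x, error=1.0, regulatory=0.0)  # suppressed sample
print(w_clean[0], w_noisy[0])  # the second update is ~4380x smaller
```

In this toy picture, a larger tunability ratio drives the gated update closer to zero, which is the behavioral claim behind "nearly completely suppresses weight update signals from mislabeled data"; the actual devices realize the gate physically via the photo-activated regulatory input.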
Source journal: Advanced Materials (Engineering/Technology — Materials Science: Multidisciplinary)
CiteScore: 43.00
Self-citation rate: 4.10%
Articles per year: 2182
Review time: 2 months
About the journal: Advanced Materials, one of the world's most prestigious journals and the foundation of the Advanced portfolio, has been the home of choice for best-in-class materials science for more than 30 years. Covering this fast-growing, interdisciplinary field, the journal considers and publishes the most important discoveries on all materials from materials scientists, chemists, physicists, and engineers, as well as health and life scientists, bringing readers the latest results and trends in modern materials-related research every week.