Artificial Neurons Based on a Threshold Switching Memristor with Ultralow Threshold Voltage

Impact Factor 4.3 · CAS Tier 3 (Materials Science) · JCR Q1 (Engineering, Electrical & Electronic)
Huaxian Liang, Ting Jiang, Yu Wang, Le An, Lanxin Bian, Jiacheng Zhou, and Baolin Zhang*
{"title":"Artificial Neurons Based on a Threshold Switching Memristor with Ultralow Threshold Voltage","authors":"Huaxian Liang,&nbsp;Ting Jiang,&nbsp;Yu Wang,&nbsp;Le An,&nbsp;Lanxin Bian,&nbsp;Jiacheng Zhou and Baolin Zhang*,&nbsp;","doi":"10.1021/acsaelm.5c0018810.1021/acsaelm.5c00188","DOIUrl":null,"url":null,"abstract":"<p >Brain-inspired neuromorphic systems have recently garnered significant interest owing to their ability to effectively overcome the von Neumann bottleneck to increase computing and energy efficiency in the era of the rapid development of artificial intelligence. A hardware artificial neuron with a rectified linear unit (ReLU) activation function is highly desired for introducing a nonlinear activation function and resolving the vanishing gradient problem. In this work, we developed a ReLU artificial neuron based on a threshold switching memristor (TSM) device of Pt/Ag/Al<sub>2</sub>O<sub>3</sub>/HfO<sub>2</sub>/Ag-NIs/Pt structure with an ultralow threshold voltage. This artificial neuron realizes the ReLU activation function by correlating the amplitude of the output spike with the amplitude of the input voltage, which is reported for the first time. To mitigate the potential “dying ReLU” problem that can arise when the ReLU activation function is applied to deep spiking neural networks (SNNs), we developed a LeakyReLU artificial neuron. Experimental results showed that we successfully developed a high-integration and low-power ReLU artificial neuron and its variant, the LeakyReLU artificial neuron, and realized a digital recognition function in a simulated single-layer fully connected SNN, which is of great significance for the construction of large-scale SNNs in the future.</p>","PeriodicalId":3,"journal":{"name":"ACS Applied Electronic Materials","volume":"7 7","pages":"3019–3029 3019–3029"},"PeriodicalIF":4.3000,"publicationDate":"2025-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Electronic Materials","FirstCategoryId":"88","ListUrlMain":"https://pubs.acs.org/doi/10.1021/acsaelm.5c00188","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Brain-inspired neuromorphic systems have recently garnered significant interest owing to their ability to overcome the von Neumann bottleneck and thereby improve computing and energy efficiency in the era of rapidly developing artificial intelligence. A hardware artificial neuron with a rectified linear unit (ReLU) activation function is highly desirable for introducing nonlinearity and resolving the vanishing gradient problem. In this work, we developed a ReLU artificial neuron based on a threshold switching memristor (TSM) device with a Pt/Ag/Al2O3/HfO2/Ag-NIs/Pt structure and an ultralow threshold voltage. This artificial neuron realizes the ReLU activation function by correlating the amplitude of the output spike with the amplitude of the input voltage, an approach reported here for the first time. To mitigate the "dying ReLU" problem that can arise when the ReLU activation function is applied to deep spiking neural networks (SNNs), we also developed a LeakyReLU artificial neuron. Experimental results show that both the highly integrated, low-power ReLU artificial neuron and its LeakyReLU variant were successfully realized, and a digit-recognition task was demonstrated in a simulated single-layer fully connected SNN, which is of great significance for the construction of large-scale SNNs in the future.
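For reference, the two activation functions discussed in the abstract can be stated compactly. This is the textbook formulation; the leak coefficient α shown is a typical illustrative value, not one taken from the paper:

$$
f_{\mathrm{ReLU}}(x)=\max(0,\,x),\qquad
f_{\mathrm{LeakyReLU}}(x)=
\begin{cases}
x, & x>0\\
\alpha x, & x\le 0
\end{cases}
\qquad (0<\alpha\ll 1,\ \text{e.g. }\alpha=0.01)
$$

In the reported neuron, the role of x is played by the input voltage amplitude: the output spike amplitude tracks the input once it exceeds the device's ultralow threshold voltage, which is what makes the transfer characteristic ReLU-like.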

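As a rough illustration of how such ReLU-like neurons could serve as the output nonlinearity of a single-layer fully connected SNN for digit recognition, consider the minimal Python sketch below. Everything in it is an assumption made for illustration: the idealized tsm_neuron model, the threshold value v_th = 0.25, the leak coefficient, and the random (untrained) weights. It is not the authors' simulation setup.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def tsm_neuron(v_in, v_th=0.25, leaky=False, alpha=0.01):
    """Idealized TSM neuron: fires once the input exceeds the threshold v_th,
    with the output spike amplitude tracking the supra-threshold drive
    (ReLU-like), or leaking slightly below threshold (LeakyReLU-like)."""
    drive = v_in - v_th
    return leaky_relu(drive, alpha) if leaky else relu(drive)

# Single-layer fully connected network: 784 inputs -> 10 output neurons.
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.05, size=(10, 784))  # hypothetical weights (untrained)
x = rng.random(784)                        # stand-in for a 28x28 digit image
v = W @ x                                  # weighted input to each output neuron
spikes = tsm_neuron(v)                     # ReLU-like output spike amplitudes
print("predicted digit:", int(np.argmax(spikes)))
```

A trained network would replace the random weights W; the point of the sketch is only that the neuron's spike amplitude, not merely its firing rate, carries the graded ReLU output.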

Source journal metrics: CiteScore 7.20 · Self-citation rate 4.30% · Annual publications 567
Journal description: ACS Applied Electronic Materials is an interdisciplinary journal publishing original research covering all aspects of electronic materials. The journal is devoted to reports of new and original experimental and theoretical research of an applied nature that integrates knowledge in the areas of materials science, engineering, optics, physics, and chemistry into important applications of electronic materials. Sample research topics that span the journal's scope are inorganic, organic, ionic, and polymeric materials with properties that include conducting, semiconducting, superconducting, insulating, dielectric, magnetic, optoelectronic, piezoelectric, ferroelectric, and thermoelectric. Indexed/Abstracted: Web of Science SCIE, Scopus, CAS, INSPEC, Portico.