Transfer learning-based parameter optimization for improved 3D NAND performance

IF 2.2 | CAS Zone 4, Engineering & Technology | JCR Q3, Engineering, Electrical & Electronic
Dibyadrasta Sahoo, Ankit Gaurav, Sanjeev Kumar Manhas
{"title":"基于迁移学习的参数优化改进3D NAND性能","authors":"Dibyadrasta Sahoo,&nbsp;Ankit Gaurav,&nbsp;Sanjeev Kumar Manhas","doi":"10.1007/s10825-025-02292-8","DOIUrl":null,"url":null,"abstract":"<div><p>Process variation leads to variability in key device parameters such as plug separation, recess depth, epi-plug doping, and epi-plug height, which play a vital role in 3D NAND performance during scaling. Machine learning (ML) offers an alternate approach to predict and optimize performance by analyzing variable nonlinearity. However, in recent work, device optimization has been done over a narrow range, resulting in local rather than global optima. Additionally, these methods rely on extensive datasets, which increase costs and reduce the practicality of TCAD-ML models. This paper uses transfer learning to optimize the above parameters by integrating a long short-term memory (LSTM) model with the JAYA optimization algorithm. This approach considers a wide range of device parameters for optimization. By training on well-calibrated TCAD-generated data, we achieve an impressive accuracy rate of 98.5% in forecasting the values of threshold voltage (<i>V</i><sub>th</sub>), on current (<i>I</i><sub>on</sub>), subthreshold swing (SS), and transconductance (<i>g</i><sub><i>m</i></sub>). Our results reveal that the LSTM uses fewer datasets and outperforms feedforward neural networks with a performance improvement of 67%. Further, we achieve a mean-squared error of 0.217 using the JAYA optimization algorithm.</p></div>","PeriodicalId":620,"journal":{"name":"Journal of Computational Electronics","volume":"24 2","pages":""},"PeriodicalIF":2.2000,"publicationDate":"2025-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Transfer learning-based parameter optimization for improved 3D NAND performance\",\"authors\":\"Dibyadrasta Sahoo,&nbsp;Ankit Gaurav,&nbsp;Sanjeev Kumar Manhas\",\"doi\":\"10.1007/s10825-025-02292-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Process variation leads to variability in key device parameters such as plug separation, recess depth, epi-plug doping, and epi-plug height, which play a vital role in 3D NAND performance during scaling. Machine learning (ML) offers an alternate approach to predict and optimize performance by analyzing variable nonlinearity. However, in recent work, device optimization has been done over a narrow range, resulting in local rather than global optima. Additionally, these methods rely on extensive datasets, which increase costs and reduce the practicality of TCAD-ML models. This paper uses transfer learning to optimize the above parameters by integrating a long short-term memory (LSTM) model with the JAYA optimization algorithm. This approach considers a wide range of device parameters for optimization. By training on well-calibrated TCAD-generated data, we achieve an impressive accuracy rate of 98.5% in forecasting the values of threshold voltage (<i>V</i><sub>th</sub>), on current (<i>I</i><sub>on</sub>), subthreshold swing (SS), and transconductance (<i>g</i><sub><i>m</i></sub>). Our results reveal that the LSTM uses fewer datasets and outperforms feedforward neural networks with a performance improvement of 67%. 
Further, we achieve a mean-squared error of 0.217 using the JAYA optimization algorithm.</p></div>\",\"PeriodicalId\":620,\"journal\":{\"name\":\"Journal of Computational Electronics\",\"volume\":\"24 2\",\"pages\":\"\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2025-02-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Computational Electronics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10825-025-02292-8\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Electronics","FirstCategoryId":"5","ListUrlMain":"https://link.springer.com/article/10.1007/s10825-025-02292-8","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Process variation leads to variability in key device parameters such as plug separation, recess depth, epi-plug doping, and epi-plug height, which play a vital role in 3D NAND performance during scaling. Machine learning (ML) offers an alternative approach to predict and optimize performance by analyzing variable nonlinearity. However, in recent work, device optimization has been done over a narrow range, resulting in local rather than global optima. Additionally, these methods rely on extensive datasets, which increases costs and reduces the practicality of TCAD-ML models. This paper uses transfer learning to optimize the above parameters by integrating a long short-term memory (LSTM) model with the JAYA optimization algorithm. This approach considers a wide range of device parameters for optimization. By training on well-calibrated TCAD-generated data, we achieve an accuracy rate of 98.5% in forecasting the values of threshold voltage (Vth), on-current (Ion), subthreshold swing (SS), and transconductance (gm). Our results reveal that the LSTM uses fewer datasets and outperforms feedforward neural networks with a performance improvement of 67%. Further, we achieve a mean-squared error of 0.217 using the JAYA optimization algorithm.
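The abstract names two components: an LSTM surrogate trained on TCAD data to predict Vth, Ion, SS, and gm, and the JAYA algorithm searching the device-parameter space through that surrogate. The sketch below shows only the generic JAYA loop (move each candidate toward the current best and away from the current worst, with greedy acceptance) driving a stand-in objective; surrogate_objective, the normalized bounds, and the target vector are placeholders for illustration, not the authors' calibrated model.

```python
import numpy as np

# Placeholder objective: in the paper this role is played by the trained LSTM
# surrogate scoring predicted Vth, Ion, SS, and gm; here a toy quadratic keeps
# the sketch runnable. x = [plug_separation, recess_depth, epi_plug_doping,
# epi_plug_height], all assumed normalized to [0, 1].
def surrogate_objective(x):
    target = np.array([0.5, 0.3, 0.7, 0.4])  # assumed target point, not from the paper
    return np.sum((x - target) ** 2)          # lower is better

def jaya(objective, bounds, pop_size=20, iterations=200, seed=0):
    """Standard JAYA update: each candidate moves toward the current best and
    away from the current worst with uniform random weights; a move is kept
    only if it improves the objective."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    fit = np.array([objective(x) for x in pop])

    for _ in range(iterations):
        best, worst = pop[np.argmin(fit)], pop[np.argmax(fit)]
        for i in range(pop_size):
            r1, r2 = rng.random(len(lo)), rng.random(len(lo))
            cand = pop[i] + r1 * (best - np.abs(pop[i])) - r2 * (worst - np.abs(pop[i]))
            cand = np.clip(cand, lo, hi)      # keep candidates inside the bounds
            f = objective(cand)
            if f < fit[i]:                    # greedy acceptance
                pop[i], fit[i] = cand, f
    return pop[np.argmin(fit)], fit.min()

# Assumed normalized parameter ranges for the four device parameters.
bounds = np.array([[0.0, 1.0]] * 4)
x_opt, f_opt = jaya(surrogate_objective, bounds)
print(x_opt, f_opt)
```

In the paper's setting, surrogate_objective would instead call the trained LSTM and score its predicted Vth, Ion, SS, and gm against the desired targets; the JAYA loop itself carries no algorithm-specific tuning parameters, which is one reason it pairs naturally with a learned surrogate.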

Source journal
Journal of Computational Electronics (Engineering, Electrical & Electronic; Physics, Applied)
CiteScore: 4.50
Self-citation rate: 4.80%
Annual publication volume: 142
Review turnaround: >12 weeks
Journal description: The Journal of Computational Electronics brings together research on all aspects of modeling and simulation of modern electronics. This includes optical, electronic, mechanical, and quantum mechanical aspects, as well as research on the underlying mathematical algorithms and computational details. The related areas of energy conversion/storage and of molecular and biological systems, in which the thrust is on the charge transport, electronic, mechanical, and optical properties, are also covered. In particular, we encourage manuscripts dealing with device simulation; with optical and optoelectronic systems and photonics; with energy storage (e.g. batteries, fuel cells) and harvesting (e.g. photovoltaic); with simulation of circuits, VLSI layout, logic and architecture (based on, for example, CMOS devices, quantum-cellular automata, QBITs, or single-electron transistors); with electromagnetic simulations (such as microwave electronics and components); or with molecular and biological systems. However, in all these cases, the submitted manuscripts should explicitly address the electronic properties of the relevant systems, materials, or devices and/or present novel contributions to the physical models, computational strategies, or numerical algorithms.