An index-free sparse neural network using two-dimensional semiconductor ferroelectric field-effect transistors

Hongkai Ning, Hengdi Wen, Yuan Meng, Zhihao Yu, Yuxiang Fu, Xilu Zou, Yilin Shen, Xiai Luo, Qiyue Zhao, Tao Zhang, Lei Liu, Shitong Zhu, Taotao Li, Weisheng Li, Li Li, Li Gao, Yi Shi, Xinran Wang

Nature Electronics, published 8 January 2025. DOI: 10.1038/s41928-024-01328-4 (https://doi.org/10.1038/s41928-024-01328-4)
Citations: 0
Abstract
The fine-grained dynamic sparsity of biological synapses is an important element in the energy efficiency of the human brain. Emulating such sparsity in an artificial system requires off-chip memory indexing, which carries a considerable energy and latency overhead. Here, we report an in-memory sparsity architecture in which index memory is moved next to individual synapses, creating a sparse neural network without external memory indexing. We use a compact building block consisting of two non-volatile ferroelectric field-effect transistors, one acting as a digital sparsity bit and the other as an analogue weight. The network is formulated as the Hadamard product of the sparsity and weight matrices, and the hardware, which comprises 900 ferroelectric field-effect transistors, is based on wafer-scale chemical-vapour-deposited molybdenum disulfide integrated through back-end-of-line processes. With this system, we demonstrate key synaptic processes, including pruning, weight update and regrowth, in an unstructured and fine-grained manner. We also develop a vectorial approximate update algorithm and optimize the training schedule. Through this software–hardware co-optimization, we achieve 98.4% accuracy on an EMNIST letter-recognition task at 75% sparsity. Simulations of large neural networks show a tenfold reduction in latency and a ninefold reduction in energy consumption compared with a dense network of the same performance.
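To make the formulation concrete, the following is a minimal NumPy sketch of the Hadamard-product idea described above: each synapse pairs a binary sparsity bit (an entry of a matrix S) with an analogue weight (an entry of a matrix W), and the effective connectivity is their elementwise product. This is our illustration, not the authors' code; the array shape (30 × 30, matching the 900-device count), the magnitude-based pruning rule, the random regrowth rule and all function names are assumptions, and the paper's vectorial approximate update may differ in detail.

import numpy as np

rng = np.random.default_rng(0)

# 30 x 30 synapses = 900 device pairs, matching the 900-FeFET array size.
n_in, n_out = 30, 30
W = rng.normal(0.0, 0.1, size=(n_in, n_out))  # analogue weights (one FeFET each)
S = np.ones((n_in, n_out))                    # digital sparsity bits (one FeFET each)

def forward(x):
    # Effective weights are the Hadamard product S * W; no index memory
    # is consulted, because the mask lives next to each weight.
    return x @ (S * W)

def prune(sparsity=0.75):
    # Unstructured, fine-grained pruning (illustrative rule): clear the
    # sparsity bit of the smallest-magnitude weights.
    global S
    k = int(sparsity * W.size)
    threshold = np.sort(np.abs(W), axis=None)[k]
    S = (np.abs(W) >= threshold).astype(float)

def regrow(n_new=10):
    # Regrowth (illustrative rule): re-enable a few pruned synapses at random.
    pruned = np.flatnonzero(S == 0)
    revived = rng.choice(pruned, size=min(n_new, pruned.size), replace=False)
    S.flat[revived] = 1.0

def update(grad, lr=0.01):
    # Weight update is masked by S, so pruned devices are never programmed.
    global W
    W -= lr * (S * grad)

# Example: prune to 75% sparsity, then run a forward pass.
prune(0.75)
y = forward(rng.normal(size=n_in))

Because the mask S is stored alongside W, applying sparsity is a local elementwise product rather than an off-chip index lookup; that locality is what the in-memory sparsity architecture exploits.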
About the journal
Nature Electronics is a comprehensive journal that publishes both fundamental and applied research in the field of electronics. It encompasses a wide range of topics, including the study of new phenomena and devices, the design and construction of electronic circuits, and the practical applications of electronics. In addition, the journal explores the commercial and industrial aspects of electronics research.
The primary focus of Nature Electronics is on the development of technology and its potential impact on society. The journal publishes contributions from scientists, engineers and industry professionals, offering a platform for their research findings. Moreover, Nature Electronics provides insightful commentary, thorough reviews and analysis of the key issues that shape the field, as well as the technologies that are reshaping society.
Like all journals within the Nature portfolio, Nature Electronics upholds the highest standards of quality. It maintains a dedicated team of professional editors and follows a fair and rigorous peer-review process. The journal also ensures careful copy-editing and production, enabling swift publication, and maintains editorial independence to ensure impartial reporting.
In summary, Nature Electronics is a leading journal that publishes cutting-edge research in electronics. With its multidisciplinary approach and commitment to excellence, the journal serves as a valuable resource for scientists, engineers, and industry professionals seeking to stay at the forefront of advancements in the field.