Title: Reservoir direct feedback alignment: deep learning by physical dynamics
Authors: Mitsumasa Nakajima, Yongbo Zhang, Katsuma Inoue, Yasuo Kuniyoshi, Toshikazu Hashimoto, Kohei Nakajima
DOI: 10.1038/s42005-024-01895-0
Journal: Communications Physics (JCR Q1, Physics, Multidisciplinary; Impact Factor 5.4; Region 1, Physics and Astrophysics)
Publication date: 2024-12-19 (Journal Article), pages 1-10
Article URL: https://www.nature.com/articles/s42005-024-01895-0
Open access PDF: https://www.nature.com/articles/s42005-024-01895-0.pdf
Citations: 0
Abstract
The rapid advancement of deep learning has motivated various analog computing devices for energy-efficient non-von Neumann computing. While recent demonstrations have shown excellent performance, particularly in the inference phase, training on analog hardware remains challenging due to the complexity of training algorithms such as backpropagation. Here, we present an alternative training algorithm that combines two emerging concepts: reservoir computing (RC) and biologically inspired training. Instead of backpropagated errors, the proposed method computes the error projection using nonlinear dynamics (i.e., a reservoir), which is highly suitable for physical implementation because it requires only a single passive dynamical system with a small number of nodes. Numerical simulations with Lyapunov analysis revealed several notable features of the proposed algorithm: the reservoir should basically be selected to satisfy the echo state property; even chaotic dynamics can be used for training when their time scale is below the Lyapunov time; and performance is maximized near the edge of chaos, similar to the standard RC framework. Furthermore, we experimentally demonstrated the training of feedforward neural networks using an optoelectronic reservoir computer. Our approach provides an alternative solution for deep learning computation and its physical acceleration. Existing training algorithms for deep neural networks are not suitable for energy-efficient analog hardware. Here, the authors propose and experimentally demonstrate an alternative training algorithm based on reservoir computing, which improves training efficiency in optoelectronic implementations.
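To make the idea in the abstract concrete, the following is a minimal NumPy sketch of direct feedback alignment in which the backpropagated error is replaced by the state of a fixed, untrained reservoir driven by the output error. All network sizes, learning rates, and the batch-mean error drive are illustrative assumptions, not details taken from the paper; the reservoir's recurrent weights are rescaled to a spectral radius below 1, reflecting the echo state property condition discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task; all sizes and constants are illustrative,
# not taken from the paper.
n_in, n_hid, n_out, n_res = 8, 32, 2, 16
X = rng.standard_normal((200, n_in))
Y = np.tanh(X @ rng.standard_normal((n_in, n_out)))

# Two-layer feedforward network to be trained.
W1 = 0.1 * rng.standard_normal((n_in, n_hid))
W2 = 0.1 * rng.standard_normal((n_hid, n_out))

# Fixed, untrained reservoir that turns the output error into a
# hidden-layer feedback signal. The recurrent weights are rescaled to a
# spectral radius below 1 so the echo state property holds.
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))
W_err = rng.standard_normal((n_res, n_out))        # error -> reservoir
W_fb = 0.1 * rng.standard_normal((n_hid, n_res))   # reservoir -> hidden layer


def mse(W1, W2):
    return float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))


loss_init = mse(W1, W2)
lr, r = 0.05, np.zeros(n_res)
for _ in range(300):
    h = np.tanh(X @ W1)         # hidden activations
    e = (h @ W2) - Y            # output error

    # Reservoir dynamics replace the backpropagated error: the batch-mean
    # error drives the reservoir, and its state is read out as the
    # hidden-layer teaching signal, as in direct feedback alignment.
    r = np.tanh(W_res @ r + W_err @ e.mean(axis=0))
    delta_h = (W_fb @ r) * (1 - h**2)   # modulate by tanh slope

    W2 -= lr * h.T @ e / len(X)         # exact gradient at the output layer
    W1 -= lr * X.T @ delta_h / len(X)   # reservoir feedback for the layer below

loss_final = mse(W1, W2)
```

On this toy task the loss decreases even though the hidden layer never sees a true gradient, which is the essential point of feedback-alignment-style training: a fixed projection of the error is enough, and here that projection is generated by a passive dynamical system rather than a random matrix.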
Journal description:
Communications Physics is an open access journal from Nature Research publishing high-quality research, reviews and commentary in all areas of the physical sciences. Research papers published by the journal represent significant advances bringing new insight to a specialized area of research in physics. We also aim to provide a community forum for issues of importance to all physicists, regardless of sub-discipline.
The scope of the journal covers all areas of experimental, applied, fundamental, and interdisciplinary physical sciences. Primary research published in Communications Physics includes novel experimental results and new techniques or computational methods that may influence the work of others in the sub-discipline. We also consider submissions from adjacent research fields where the central advance of the study is of interest to physicists, for example materials science, physical chemistry, and technology.