Noise-aware training of neuromorphic dynamic device networks.

IF 15.7 · JCR Zone 1 · Q1 · MULTIDISCIPLINARY SCIENCES
Luca Manneschi,Ian T Vidamour,Kilian D Stenning,Charles Swindells,Guru Venkat,David Griffin,Lai Gui,Daanish Sonawala,Denis Donskikh,Dana Hariga,Elisa Donati,Susan Stepney,Will R Branford,Jack C Gartside,Thomas J Hayward,Matthew O A Ellis,Eleni Vasilaki
DOI: 10.1038/s41467-025-64232-1
Journal: Nature Communications 104(1): 9192
Published: 2025-10-16 (Journal Article)
Citations: 0

Abstract

In materio computing offers the potential for widespread embodied intelligence by leveraging the intrinsic dynamics of complex systems for efficient sensing, processing, and interaction. While individual devices offer basic data processing capabilities, networks of interconnected devices can perform more complex and varied tasks. However, designing such networks for dynamic tasks is challenging in the absence of physical models and accurate characterization of device noise. We introduce the Noise-Aware Dynamic Optimization (NADO) framework for training networks of dynamical devices, using Neural Stochastic Differential Equations (Neural-SDEs) as differentiable digital twins to capture both the dynamics and stochasticity of devices with intrinsic memory. Our approach combines backpropagation through time with cascade learning, enabling effective exploitation of the temporal properties of physical devices. We validate this method on networks of spintronic devices across both temporal classification and regression tasks. By decoupling device model training from network connectivity optimization, our framework reduces data requirements and enables robust, gradient-based programming of dynamical devices without requiring analytical descriptions of their behaviour.
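The abstract describes Neural Stochastic Differential Equations as differentiable digital twins that capture both the dynamics and the noise of physical devices. As a rough illustration of the general idea (not the paper's NADO implementation), the sketch below integrates a neural SDE dx = f(x) dt + g(x) dW with the Euler–Maruyama scheme, where the drift f and diffusion g are small networks with hypothetical, randomly initialized parameters; the stochastic dW term is what lets such a model reproduce device-to-device and run-to-run variability.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    """Tiny one-hidden-layer network, used here for both drift and diffusion."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def init_params(d_in, d_hidden, d_out):
    """Hypothetical random initialization for the illustration."""
    return (rng.normal(0.0, 0.3, (d_in, d_hidden)), np.zeros(d_hidden),
            rng.normal(0.0, 0.3, (d_hidden, d_out)), np.zeros(d_out))

def simulate_neural_sde(drift, diffusion, x0, n_steps, dt):
    """Euler-Maruyama integration of dx = f(x) dt + g(x) dW.

    The g(x) dW term models intrinsic device noise: repeated runs from the
    same initial state yield different trajectories, which is what a
    noise-aware digital twin must reproduce."""
    x = x0.copy()
    path = [x.copy()]
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)  # Brownian increment
        x = x + mlp(drift, x) * dt + mlp(diffusion, x) * dW
        path.append(x.copy())
    return np.stack(path)

d = 2  # toy device state dimension
drift_params = init_params(d, 16, d)
diff_params = init_params(d, 16, d)
path = simulate_neural_sde(drift_params, diff_params,
                           x0=np.zeros(d), n_steps=100, dt=0.01)
print(path.shape)  # (101, 2)
```

Because every operation in the simulation is differentiable (the noise enters only through the externally sampled dW), gradients can flow through the trajectory, which is the property that permits backpropagation-through-time training of such models.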
Source journal
Nature Communications
CiteScore: 24.90
Self-citation rate: 2.40%
Articles per year: 6928
Review time: 3.7 months
Journal description: Nature Communications, an open-access journal, publishes high-quality research spanning all areas of the natural sciences. Papers featured in the journal showcase significant advances relevant to specialists in each respective field. With a 2-year impact factor of 16.6 (2022) and a median time of 8 days from submission to the first editorial decision, Nature Communications is committed to rapid dissemination of research findings. As a multidisciplinary journal, it welcomes contributions from the biological, health, physical, chemical, Earth, social, mathematical, applied, and engineering sciences, aiming to highlight important breakthroughs within each domain.