Noise-aware training of neuromorphic dynamic device networks

Luca Manneschi, Ian T Vidamour, Kilian D Stenning, Charles Swindells, Guru Venkat, David Griffin, Lai Gui, Daanish Sonawala, Denis Donskikh, Dana Hariga, Elisa Donati, Susan Stepney, Will R Branford, Jack C Gartside, Thomas J Hayward, Matthew O A Ellis, Eleni Vasilaki
{"title":"神经形态动态设备网络的噪声感知训练。","authors":"Luca Manneschi,Ian T Vidamour,Kilian D Stenning,Charles Swindells,Guru Venkat,David Griffin,Lai Gui,Daanish Sonawala,Denis Donskikh,Dana Hariga,Elisa Donati,Susan Stepney,Will R Branford,Jack C Gartside,Thomas J Hayward,Matthew O A Ellis,Eleni Vasilaki","doi":"10.1038/s41467-025-64232-1","DOIUrl":null,"url":null,"abstract":"In materio computing offers the potential for widespread embodied intelligence by leveraging the intrinsic dynamics of complex systems for efficient sensing, processing, and interaction. While individual devices offer basic data processing capabilities, networks of interconnected devices can perform more complex and varied tasks. However, designing such networks for dynamic tasks is challenging in the absence of physical models and accurate characterization of device noise. We introduce the Noise-Aware Dynamic Optimization (NADO) framework for training networks of dynamical devices, using Neural Stochastic Differential Equations (Neural-SDEs) as differentiable digital twins to capture both the dynamics and stochasticity of devices with intrinsic memory. Our approach combines backpropagation through time with cascade learning, enabling effective exploitation of the temporal properties of physical devices. We validate this method on networks of spintronic devices across both temporal classification and regression tasks. By decoupling device model training from network connectivity optimization, our framework reduces data requirements and enables robust, gradient-based programming of dynamical devices without requiring analytical descriptions of their behaviour.","PeriodicalId":19066,"journal":{"name":"Nature Communications","volume":"104 1","pages":"9192"},"PeriodicalIF":15.7000,"publicationDate":"2025-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Noise-aware training of neuromorphic dynamic device networks.\",\"authors\":\"Luca Manneschi,Ian T Vidamour,Kilian D Stenning,Charles Swindells,Guru Venkat,David Griffin,Lai Gui,Daanish Sonawala,Denis Donskikh,Dana Hariga,Elisa Donati,Susan Stepney,Will R Branford,Jack C Gartside,Thomas J Hayward,Matthew O A Ellis,Eleni Vasilaki\",\"doi\":\"10.1038/s41467-025-64232-1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In materio computing offers the potential for widespread embodied intelligence by leveraging the intrinsic dynamics of complex systems for efficient sensing, processing, and interaction. While individual devices offer basic data processing capabilities, networks of interconnected devices can perform more complex and varied tasks. However, designing such networks for dynamic tasks is challenging in the absence of physical models and accurate characterization of device noise. We introduce the Noise-Aware Dynamic Optimization (NADO) framework for training networks of dynamical devices, using Neural Stochastic Differential Equations (Neural-SDEs) as differentiable digital twins to capture both the dynamics and stochasticity of devices with intrinsic memory. Our approach combines backpropagation through time with cascade learning, enabling effective exploitation of the temporal properties of physical devices. We validate this method on networks of spintronic devices across both temporal classification and regression tasks. 
By decoupling device model training from network connectivity optimization, our framework reduces data requirements and enables robust, gradient-based programming of dynamical devices without requiring analytical descriptions of their behaviour.\",\"PeriodicalId\":19066,\"journal\":{\"name\":\"Nature Communications\",\"volume\":\"104 1\",\"pages\":\"9192\"},\"PeriodicalIF\":15.7000,\"publicationDate\":\"2025-10-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Nature Communications\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1038/s41467-025-64232-1\",\"RegionNum\":1,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature Communications","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1038/s41467-025-64232-1","RegionNum":1,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
In materio computing offers the potential for widespread embodied intelligence by leveraging the intrinsic dynamics of complex systems for efficient sensing, processing, and interaction. While individual devices offer basic data processing capabilities, networks of interconnected devices can perform more complex and varied tasks. However, designing such networks for dynamic tasks is challenging in the absence of physical models and accurate characterization of device noise. We introduce the Noise-Aware Dynamic Optimization (NADO) framework for training networks of dynamical devices, using Neural Stochastic Differential Equations (Neural-SDEs) as differentiable digital twins to capture both the dynamics and stochasticity of devices with intrinsic memory. Our approach combines backpropagation through time with cascade learning, enabling effective exploitation of the temporal properties of physical devices. We validate this method on networks of spintronic devices across both temporal classification and regression tasks. By decoupling device model training from network connectivity optimization, our framework reduces data requirements and enables robust, gradient-based programming of dynamical devices without requiring analytical descriptions of their behaviour.
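The abstract outlines a two-stage recipe: first fit a stochastic digital twin to each device's measured behaviour, then freeze the twins and train the surrounding network weights by backpropagation through time. As a rough illustration of that pattern only, the sketch below uses plain PyTorch with Euler-Maruyama integration. `DeviceTwin`, `fit_twin`, `train_network`, and every hyperparameter here are hypothetical assumptions; the paper's cascade-learning schedule and its actual Neural-SDE training objective are not reproduced.

```python
# Minimal, hypothetical sketch of the two-stage pattern the abstract
# describes. Names, shapes, and objectives are illustrative assumptions,
# not the authors' NADO implementation.
import torch
import torch.nn as nn


class DeviceTwin(nn.Module):
    """Neural SDE dh = f(h, u) dt + g(h, u) dW: a stochastic digital twin
    of one noisy device with intrinsic memory (state h, drive input u)."""

    def __init__(self, state_dim: int = 4, input_dim: int = 1, hidden: int = 32):
        super().__init__()
        self.drift = nn.Sequential(
            nn.Linear(state_dim + input_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim))
        self.diffusion = nn.Sequential(
            nn.Linear(state_dim + input_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim), nn.Softplus())  # noise scale >= 0

    def step(self, h, u, dt: float):
        z = torch.cat([h, u], dim=-1)
        dW = torch.randn_like(h) * dt ** 0.5          # Brownian increment
        return h + self.drift(z) * dt + self.diffusion(z) * dW

    def forward(self, h0, inputs, dt: float = 0.1):
        # inputs: (batch, time, input_dim) -> trajectory (batch, time, state_dim)
        h, states = h0, []
        for t in range(inputs.shape[1]):
            h = self.step(h, inputs[:, t], dt)
            states.append(h)
        return torch.stack(states, dim=1)


def fit_twin(twin, drives, traces, epochs: int = 100):
    """Stage 1: fit the twin to measured device trajectories, sketched
    here as trajectory MSE over stochastic rollouts."""
    opt = torch.optim.Adam(twin.parameters(), lr=1e-3)
    for _ in range(epochs):
        pred = twin(traces[:, 0], drives)       # start from measured state
        loss = ((pred - traces[:, 1:]) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return twin


def train_network(twin, in_weights, readout, dataset, epochs: int = 50):
    """Stage 2 (decoupled): freeze the fitted twin and optimise only the
    trainable interconnect and readout by backpropagation through time."""
    for p in twin.parameters():
        p.requires_grad_(False)
    params = list(in_weights.parameters()) + list(readout.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)
    for _ in range(epochs):
        for x, target in dataset:               # x: (batch, time, features)
            u = in_weights(x)                   # learned drive into the device
            h0 = torch.zeros(x.shape[0], 4)     # state_dim = 4, as above
            traj = twin(h0, u)                  # noisy rollout of the twin
            loss = nn.functional.mse_loss(readout(traj[:, -1]), target)
            opt.zero_grad()
            loss.backward()                     # BPTT through the frozen SDE
            opt.step()
    return in_weights, readout
```

Under these assumptions, stage 1 sees only per-device input/response traces and stage 2 touches only the interconnect and readout weights, so the physical devices never need an analytical, differentiable description, which is the decoupling the abstract highlights.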
About the journal:
Nature Communications, an open-access journal, publishes high-quality research spanning all areas of the natural sciences. Papers featured in the journal showcase significant advances relevant to specialists in each respective field. With a 2-year impact factor of 16.6 (2022) and a median time of 8 days from submission to the first editorial decision, Nature Communications is committed to rapid dissemination of research findings. As a multidisciplinary journal, it welcomes contributions from biological, health, physical, chemical, Earth, social, mathematical, applied, and engineering sciences, aiming to highlight important breakthroughs within each domain.