Fast-slow neural networks for learning singularly perturbed dynamical systems
Daniel A. Serino, Allen Alvarez Loya, J.W. Burby, Ioannis G. Kevrekidis, Qi Tang
Journal of Computational Physics, Volume 537, Article 114090 (published 2025-05-16)
DOI: 10.1016/j.jcp.2025.114090
URL: https://www.sciencedirect.com/science/article/pii/S0021999125003730
Singularly perturbed dynamical systems play a crucial role in climate dynamics and plasma physics. A powerful and well-known tool to address these systems is the Fenichel normal form, which significantly simplifies fast dynamics near slow manifolds through a transformation. However, this normal form is difficult to realize in conventional numerical algorithms. In this work, we explore an alternative way of realizing it through structure-preserving machine learning. Specifically, a fast-slow neural network (FSNN) is proposed for learning data-driven models of singularly perturbed dynamical systems with dissipative fast timescale dynamics. Our method enforces the existence of a trainable, attracting invariant slow manifold as a hard constraint. Closed-form representation of the slow manifold enables efficient integration on the slow time scale and significantly improves prediction accuracy beyond the training data. We demonstrate the FSNN on examples including the Grad moment system, two-scale Lorenz96 equations, and Abraham-Lorentz dynamics modeling radiation reaction of electrons.
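For readers unfamiliar with the setting, a standard way to write a singularly perturbed (fast-slow) system and its Fenichel slow manifold is the following; this is textbook background in illustrative notation, not notation taken from the paper itself:

```latex
% Textbook fast-slow form (illustrative notation, not copied from the paper):
%   y : slow variables,  z : fast variables,  0 < \varepsilon \ll 1
\begin{aligned}
  \dot{y} &= f(y, z, \varepsilon), \\
  \varepsilon \, \dot{z} &= g(y, z, \varepsilon).
\end{aligned}
% If the critical manifold \{ g(y, z, 0) = 0 \}, written as a graph z = h_0(y),
% is normally hyperbolic and attracting, Fenichel theory gives a nearby
% invariant slow manifold z = h_\varepsilon(y) = h_0(y) + O(\varepsilon),
% on which the dynamics reduce to \dot{y} = f(y, h_\varepsilon(y), \varepsilon).
```

On the architectural side, the abstract states that the FSNN enforces a trainable, attracting invariant slow manifold as a hard constraint. A minimal sketch of one way such a constraint can be hard-wired into a learned vector field is given below; the names f_net, h_net, and A_net are hypothetical stand-ins for trainable networks and this is not the authors' implementation:

```python
import numpy as np

def fsnn_vector_field(y, z, eps, f_net, h_net, A_net):
    """Illustrative fast-slow vector field with a built-in invariant slow manifold.

        dy/dt       = f_net(y, z)                        (slow dynamics)
        eps * dz/dt = A_net(y, z) @ (z - h_net(y))       (fast dynamics)

    The fast right-hand side vanishes exactly when z = h_net(y), so the graph of
    h_net is invariant by construction; if A_net has a negative-definite symmetric
    part, that manifold is also attracting (dissipative fast dynamics).
    """
    dy = f_net(y, z)
    dz = A_net(y, z) @ (z - h_net(y)) / eps
    return dy, dz

if __name__ == "__main__":
    # Toy linear stand-ins in place of trained networks, just to exercise the sketch.
    rng = np.random.default_rng(0)
    h_net = lambda y: 0.5 * y                    # slow-manifold parameterization
    f_net = lambda y, z: -y + z                  # slow vector field
    A_net = lambda y, z: -np.eye(len(z))         # dissipative fast linear operator
    y, z = rng.normal(size=2), rng.normal(size=2)
    dy, dz = fsnn_vector_field(y, z, eps=1e-3, f_net=f_net, h_net=h_net, A_net=A_net)
    print("dy/dt =", dy, " dz/dt =", dz)
```

In a construction of this kind, once h_net is learned, the reduced slow model dy/dt = f_net(y, h_net(y)) can be integrated with slow-time-scale steps, which is the kind of efficiency gain and closed-form slow-manifold representation the abstract describes.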
About the journal:
Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries.
The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.