A generalized framework of neural networks for Hamiltonian systems

Impact Factor: 3.8 · CAS Zone 2 (Physics & Astronomy) · JCR Q2 (Computer Science, Interdisciplinary Applications)
Philipp Horn, Veronica Saz Ulibarrena, Barry Koren, Simon Portegies Zwart
DOI: 10.1016/j.jcp.2024.113536
Journal of Computational Physics, Vol. 521, Article 113536
Published: 2024-10-28 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S0021999124007848
Citations: 0

Abstract

A generalized framework of neural networks for Hamiltonian systems
When solving Hamiltonian systems with numerical integrators, preserving the symplectic structure can be crucial for many problems. At the same time, solving chaotic or stiff problems requires integrators to approximate the trajectories with extreme precision. Integrating Hamilton's equations to a level of reliability at which the answer can be used for scientific interpretation may therefore be computationally expensive. However, a neural network can be a viable alternative to numerical integrators, offering high-fidelity solutions orders of magnitude faster.
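To illustrate why symplecticity matters for integrators (an illustrative sketch, not taken from the paper), the following compares explicit Euler with semi-implicit (symplectic) Euler on the unit pendulum Hamiltonian H(q, p) = p²/2 − cos q. The step sizes and initial condition are arbitrary choices for the demonstration:

```python
import math

def energy(q, p):
    # Hamiltonian of the unit pendulum: H(q, p) = p^2/2 - cos(q).
    return 0.5 * p * p - math.cos(q)

def step_explicit(q, p, dt):
    # Explicit Euler: both updates use the old state; not symplectic.
    return q + dt * p, p - dt * math.sin(q)

def step_symplectic(q, p, dt):
    # Semi-implicit (symplectic) Euler: update p first, then q with the new p.
    p_new = p - dt * math.sin(q)
    return q + dt * p_new, p_new

dt, steps = 0.1, 1000
(qe, pe), (qs, ps) = (1.0, 0.0), (1.0, 0.0)
H0 = energy(1.0, 0.0)
for _ in range(steps):
    qe, pe = step_explicit(qe, pe, dt)
    qs, ps = step_symplectic(qs, ps, dt)

# The symplectic step keeps the energy error bounded over long times,
# while the explicit step drifts away from the true energy level.
print(abs(energy(qe, pe) - H0), abs(energy(qs, ps) - H0))
```

The symplectic variant costs the same per step; the only change is the order of the updates, which makes the map area-preserving in phase space.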
To understand whether it is also important to preserve symplecticity when neural networks are used, we analyze three well-known neural network architectures that embed the symplectic structure in the network's topology. Many similarities can be found between these architectures, which allows us to formulate a new, generalized framework in which Symplectic Recurrent Neural Networks, SympNets and HénonNets are included as special cases. Additionally, this new framework enables us to find novel neural network topologies by transitioning between the established ones.
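Architectures of this family compose shear-like modules that are symplectic by construction: updating p by a function of q alone (or q by a function of p alone) always has unit Jacobian determinant in one degree of freedom. A minimal sketch of the idea, with fixed nonlinearities standing in for trained parameters (the specific functions and coefficients are illustrative assumptions, not the paper's layers):

```python
import math

def shear_p(q, p, grad_v):
    # p-update by a function of q only; symplectic for any choice of grad_v.
    return q, p + grad_v(q)

def shear_q(q, p, grad_t):
    # Companion q-update; alternating the two builds expressive symplectic maps.
    return q + grad_t(p), p

def layer(q, p):
    # One toy "layer": in a trained network these functions would be
    # parameterized; fixed nonlinearities are used here as stand-ins.
    q, p = shear_p(q, p, lambda x: 0.3 * math.tanh(x))
    q, p = shear_q(q, p, lambda x: -0.5 * math.sin(x))
    return q, p

def jacobian_det(f, q, p, eps=1e-6):
    # Numerical check of area preservation (det J = 1 in one dimension),
    # via central finite differences of the map f.
    q1, p1 = f(q + eps, p)
    q0, p0 = f(q - eps, p)
    dq_dq, dp_dq = (q1 - q0) / (2 * eps), (p1 - p0) / (2 * eps)
    q1, p1 = f(q, p + eps)
    q0, p0 = f(q, p - eps)
    dq_dp, dp_dp = (q1 - q0) / (2 * eps), (p1 - p0) / (2 * eps)
    return dq_dq * dp_dp - dq_dp * dp_dq

print(jacobian_det(layer, 0.7, -0.2))  # ≈ 1.0 regardless of the nonlinearities
```

Because each shear has unit Jacobian determinant, any composition of such layers is symplectic, however the individual functions are chosen; this is the structural guarantee the generalized framework preserves.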
We compare the new Generalized Hamiltonian Neural Networks (GHNNs) against the already established SympNets, HénonNets and physics-unaware multilayer perceptrons. This comparison is performed with data for a pendulum, a double pendulum and a gravitational 3-body problem. To achieve a fair comparison, the hyperparameters of the different neural networks are chosen such that the prediction speeds of all four architectures are the same during inference. Special attention is paid to the capability of the neural networks to generalize outside the training data. The GHNNs outperform all other neural network architectures for the problems considered.
Source journal
Journal of Computational Physics (Physics; Computer Science: Interdisciplinary Applications)
CiteScore: 7.60
Self-citation rate: 14.60%
Articles published: 763
Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.