Inverse evolution data augmentation for neural PDE solvers.

IF 3.7 · CAS Tier 3 (Multidisciplinary) · JCR Q1 MULTIDISCIPLINARY SCIENCES
Chaoyu Liu, Chris Budd, Carola-Bibiane Schönlieb
{"title":"神经PDE解算器的逆进化数据增强。","authors":"Chaoyu Liu, Chris Budd, Carola-Bibiane Schönlieb","doi":"10.1098/rsta.2024.0242","DOIUrl":null,"url":null,"abstract":"<p><p>Neural networks have emerged as promising tools for solving partial differential equations (PDEs), particularly through the application of neural operators. Training neural operators typically requires a large amount of training data to ensure accuracy and generalization. In this article, we propose a novel data augmentation method specifically designed for training neural operators on evolution equations. Our approach utilizes insights from inverse processes of these equations to efficiently generate data from random initialization that are combined with original data. To further enhance the accuracy of the augmented data, we introduce high-order inverse evolution schemes. These schemes consist of only a few explicit computation steps, yet the resulting data pairs can be proven to satisfy the corresponding implicit numerical schemes. In contrast to traditional PDE solvers that require small time steps or implicit schemes to guarantee accuracy, our data augmentation method employs explicit schemes with relatively large time steps, thereby significantly reducing computational costs. Accuracy and efficacy experiments confirm the effectiveness of our approach. In addition, we validate our approach through experiments with the Fourier neural operator (FNO) and UNet on three common evolution equations: Burgers' equation, the Allen-Cahn equation and the Navier-Stokes equation. The results demonstrate a significant improvement in the performance and robustness of the FNO when coupled with our inverse evolution data augmentation method.This article is part of the theme issue 'Partial differential equations in data science'.</p>","PeriodicalId":19879,"journal":{"name":"Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences","volume":"383 2298","pages":"20240242"},"PeriodicalIF":3.7000,"publicationDate":"2025-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Inverse evolution data augmentation for neural PDE solvers.\",\"authors\":\"Chaoyu Liu, Chris Budd, Carola-Bibiane Schönlieb\",\"doi\":\"10.1098/rsta.2024.0242\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Neural networks have emerged as promising tools for solving partial differential equations (PDEs), particularly through the application of neural operators. Training neural operators typically requires a large amount of training data to ensure accuracy and generalization. In this article, we propose a novel data augmentation method specifically designed for training neural operators on evolution equations. Our approach utilizes insights from inverse processes of these equations to efficiently generate data from random initialization that are combined with original data. To further enhance the accuracy of the augmented data, we introduce high-order inverse evolution schemes. These schemes consist of only a few explicit computation steps, yet the resulting data pairs can be proven to satisfy the corresponding implicit numerical schemes. In contrast to traditional PDE solvers that require small time steps or implicit schemes to guarantee accuracy, our data augmentation method employs explicit schemes with relatively large time steps, thereby significantly reducing computational costs. 
Accuracy and efficacy experiments confirm the effectiveness of our approach. In addition, we validate our approach through experiments with the Fourier neural operator (FNO) and UNet on three common evolution equations: Burgers' equation, the Allen-Cahn equation and the Navier-Stokes equation. The results demonstrate a significant improvement in the performance and robustness of the FNO when coupled with our inverse evolution data augmentation method.This article is part of the theme issue 'Partial differential equations in data science'.</p>\",\"PeriodicalId\":19879,\"journal\":{\"name\":\"Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences\",\"volume\":\"383 2298\",\"pages\":\"20240242\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2025-06-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1098/rsta.2024.0242\",\"RegionNum\":3,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1098/rsta.2024.0242","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract


Neural networks have emerged as promising tools for solving partial differential equations (PDEs), particularly through the application of neural operators. Training neural operators typically requires a large amount of training data to ensure accuracy and generalization. In this article, we propose a novel data augmentation method specifically designed for training neural operators on evolution equations. Our approach utilizes insights from inverse processes of these equations to efficiently generate data from random initialization that are combined with original data. To further enhance the accuracy of the augmented data, we introduce high-order inverse evolution schemes. These schemes consist of only a few explicit computation steps, yet the resulting data pairs can be proven to satisfy the corresponding implicit numerical schemes. In contrast to traditional PDE solvers that require small time steps or implicit schemes to guarantee accuracy, our data augmentation method employs explicit schemes with relatively large time steps, thereby significantly reducing computational costs. Accuracy and efficacy experiments confirm the effectiveness of our approach. In addition, we validate our approach through experiments with the Fourier neural operator (FNO) and UNet on three common evolution equations: Burgers' equation, the Allen-Cahn equation and the Navier-Stokes equation. The results demonstrate a significant improvement in the performance and robustness of the FNO when coupled with our inverse evolution data augmentation method. This article is part of the theme issue 'Partial differential equations in data science'.
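To make the core idea concrete, below is a minimal first-order sketch for a 1-D Burgers'-type equation (the article itself introduces higher-order inverse evolution schemes). Starting from a random state, one explicit step of the reverse-time equation yields a pair that, by construction, satisfies the implicit (backward Euler) scheme of the forward equation. All names, grid parameters and the random initialization are illustrative assumptions, not taken from the article.

```python
# Sketch of inverse-evolution data augmentation for u_t = nu*u_xx - u*u_x (1-D, periodic).
# Illustrative only: parameters and helper names are assumptions, not from the paper.
import numpy as np

nu = 0.01            # viscosity
nx = 256             # spatial grid points
dx = 2 * np.pi / nx
dt = 0.05            # relatively large time step, taken explicitly in reverse time

def rhs(u):
    """Spatial operator f(u) = nu*u_xx - u*u_x with periodic central differences."""
    u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    u_xx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    return nu * u_xx - u * u_x

def inverse_evolution_pair(u_next):
    """One explicit step of the reverse-time equation: u_prev = u_next - dt*f(u_next).
    The pair (u_prev, u_next) then satisfies the implicit backward Euler scheme of the
    forward equation, u_next = u_prev + dt*f(u_next), so it can serve as an
    (input, target) training pair for a neural operator."""
    return u_next - dt * rhs(u_next), u_next

# Generate augmented pairs from smooth random initializations.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
pairs = []
for _ in range(100):
    # random superposition of low-frequency Fourier modes
    u_next = sum(rng.normal() * np.sin(k * x + rng.uniform(0, 2 * np.pi))
                 for k in range(1, 5))
    pairs.append(inverse_evolution_pair(u_next))
```

The design point is that the reverse-time step is explicit and cheap, yet the resulting pair is exact for an implicit scheme of the forward equation, which is what allows comparatively large time steps without the cost of an implicit solve.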

Source journal
CiteScore: 9.30
Self-citation rate: 2.00%
Articles published: 367
Review time: 3 months
Journal description: Continuing its long history of influential scientific publishing, Philosophical Transactions A publishes high-quality theme issues on topics of current importance and general interest within the physical, mathematical and engineering sciences, guest-edited by leading authorities and comprising new research, reviews and opinions from prominent researchers.