On the preprocessing of physics-informed neural networks: How to better utilize data in fluid mechanics

Impact Factor 3.8 | CAS Zone 2 (Physics and Astronomy) | JCR Q2 (Computer Science, Interdisciplinary Applications)
Shengfeng Xu, Yuanjun Dai, Chang Yan, Zhenxu Sun, Renfang Huang, Dilong Guo, Guowei Yang
{"title":"On the preprocessing of physics-informed neural networks: How to better utilize data in fluid mechanics","authors":"Shengfeng Xu ,&nbsp;Yuanjun Dai ,&nbsp;Chang Yan ,&nbsp;Zhenxu Sun ,&nbsp;Renfang Huang ,&nbsp;Dilong Guo ,&nbsp;Guowei Yang","doi":"10.1016/j.jcp.2025.113837","DOIUrl":null,"url":null,"abstract":"<div><div>Physics-Informed Neural Networks (PINNs) serve as a flexible alternative for tackling forward and inverse problems in differential equations, displaying impressive advancements in diverse areas of applied mathematics. Despite integrating both data and underlying physics to enrich the neural network's understanding, concerns regarding the effectiveness and practicality of PINNs persist. Over the past few years, extensive efforts in the current literature have been made to enhance this evolving method, by drawing inspiration from both machine learning algorithms and numerical methods. Despite notable progressions in PINNs algorithms, the important and fundamental field of data preprocessing remain unexplored, limiting the applications of PINNs especially in solving inverse problems. Therefore in this paper, a concise yet potent data preprocessing method focusing on data normalization was proposed. By applying a linear transformation to both the data and corresponding equations concurrently, the normalized PINNs approach was evaluated on the task of reconstructing flow fields in four turbulent cases. The results illustrate that by adhering to the data preprocessing procedure, PINNs can robustly achieve higher prediction accuracy for all flow quantities under different hyperparameter setups, without incurring extra computational cost, distinctly improving the utilization of limited training data. Though mainly verified in Navier-Stokes (NS) equations, this method holds potential for application to various other equations.</div></div>","PeriodicalId":352,"journal":{"name":"Journal of Computational Physics","volume":"528 ","pages":"Article 113837"},"PeriodicalIF":3.8000,"publicationDate":"2025-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Physics","FirstCategoryId":"101","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0021999125001202","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Physics-Informed Neural Networks (PINNs) serve as a flexible alternative for tackling forward and inverse problems in differential equations, showing impressive advances in diverse areas of applied mathematics. Despite integrating both data and underlying physics to enrich the neural network's understanding, concerns regarding the effectiveness and practicality of PINNs persist. Over the past few years, extensive efforts in the literature have been made to enhance this evolving method by drawing inspiration from both machine learning algorithms and numerical methods. Despite notable progress in PINN algorithms, the important and fundamental area of data preprocessing remains unexplored, limiting the application of PINNs, especially to inverse problems. Therefore, in this paper, a concise yet effective data preprocessing method focusing on data normalization was proposed. By applying a linear transformation to both the data and the corresponding equations concurrently, the normalized PINN approach was evaluated on the task of reconstructing flow fields in four turbulent cases. The results show that, by following this data preprocessing procedure, PINNs robustly achieve higher prediction accuracy for all flow quantities under different hyperparameter setups without incurring extra computational cost, markedly improving the utilization of limited training data. Though verified mainly on the Navier-Stokes (NS) equations, this method holds potential for application to various other equations.
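The core idea described in the abstract, normalizing the training data with a linear map and rewriting the governing equations in the same normalized variables, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example assuming a simple min-max normalization of sparse flow-field samples; the helper name `linear_normalize` and the placeholder arrays are illustrative and not taken from the paper, and the exact transformation used by the authors may differ.

```python
import numpy as np

def linear_normalize(samples):
    """Map each column of `samples` to [0, 1] with a linear transform
    x_hat = (x - x_min) / (x_max - x_min); return the shift and scale so the
    governing equations can be rewritten in the normalized variables."""
    lo = samples.min(axis=0)
    scale = samples.max(axis=0) - lo
    return (samples - lo) / scale, lo, scale

# Placeholder data standing in for sparse measurements (t, x, y) -> (u, v, p).
rng = np.random.default_rng(0)
coords = rng.uniform(low=0.0, high=10.0, size=(1000, 3))   # t, x, y
fields = rng.normal(loc=0.0, scale=2.0, size=(1000, 3))    # u, v, p

coords_n, coord_shift, coord_scale = linear_normalize(coords)
fields_n, field_shift, field_scale = linear_normalize(fields)

# Chain rule: with x_hat = (x - x0) / Lx and u_hat = (u - u0) / U, a term such
# as du/dx becomes (U / Lx) * du_hat/dx_hat, so every derivative in the
# Navier-Stokes residual picks up a known constant factor that must be folded
# into the PINN loss when training on the normalized inputs and outputs.
dudx_factor = field_scale[0] / coord_scale[1]   # U / Lx for the (u, x) pair
print(coords_n.min(), coords_n.max(), dudx_factor)
```

Because the transformation is linear with known constants, applying it adds no extra computational cost at training time, consistent with the claim in the abstract.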
Source journal: Journal of Computational Physics (Physics; Computer Science, Interdisciplinary Applications)
CiteScore: 7.60
Self-citation rate: 14.60%
Articles published: 763
Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.