On the preprocessing of physics-informed neural networks: How to better utilize data in fluid mechanics
Shengfeng Xu, Yuanjun Dai, Chang Yan, Zhenxu Sun, Renfang Huang, Dilong Guo, Guowei Yang
Journal of Computational Physics, Volume 528, Article 113837 (published 2025-02-13). DOI: 10.1016/j.jcp.2025.113837. https://www.sciencedirect.com/science/article/pii/S0021999125001202
Physics-Informed Neural Networks (PINNs) serve as a flexible alternative for tackling forward and inverse problems in differential equations and have shown impressive advances in diverse areas of applied mathematics. Despite integrating both data and the underlying physics to enrich the neural network's understanding, concerns about the effectiveness and practicality of PINNs persist. Over the past few years, extensive efforts have been made in the literature to enhance this evolving method, drawing inspiration from both machine learning algorithms and numerical methods. Despite notable progress in PINN algorithms, the important and fundamental matter of data preprocessing remains underexplored, limiting the applications of PINNs, especially in solving inverse problems. Therefore, in this paper a concise yet effective data preprocessing method focusing on data normalization is proposed. By applying a linear transformation to both the data and the corresponding equations concurrently, the normalized PINN approach is evaluated on the task of reconstructing flow fields in four turbulent cases. The results show that, by adhering to this data preprocessing procedure, PINNs robustly achieve higher prediction accuracy for all flow quantities under different hyperparameter setups, without incurring extra computational cost, markedly improving the utilization of limited training data. Though mainly verified on the Navier-Stokes (NS) equations, the method holds potential for application to various other equations.
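The following is a minimal, illustrative sketch (not the authors' implementation) of the idea described in the abstract: normalize the training data with a linear transformation and carry the same transformation into the governing equations through constant chain-rule factors, so the physics residual stays consistent with the rescaled variables. The network architecture, the placeholder data, and the restriction to the 2-D continuity equation are assumptions made for brevity.

```python
import torch

def fit_linear_scaling(x):
    """Scale/shift so that (x - shift) / scale lies in [-1, 1] per column."""
    x_min, x_max = x.min(dim=0).values, x.max(dim=0).values
    return (x_max - x_min) / 2.0, (x_max + x_min) / 2.0

# Placeholder "measurements": coordinates (x, y, t) and flow quantities (u, v, p).
coords_raw = torch.rand(1000, 3) * torch.tensor([2.0, 1.0, 10.0])
fields_raw = torch.rand(1000, 3) * torch.tensor([5.0, 1.0, 100.0])

c_scale, c_shift = fit_linear_scaling(coords_raw)
f_scale, f_shift = fit_linear_scaling(fields_raw)
coords = (coords_raw - c_shift) / c_scale   # normalized network inputs
fields = (fields_raw - f_shift) / f_scale   # normalized training targets

# Small fully connected network mapping normalized (x, y, t) to normalized (u, v, p).
net = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 3),
)

def continuity_residual(xyt_norm):
    """Continuity equation du/dx + dv/dy = 0 written in normalized variables.

    By the chain rule, du/dx = (f_scale_u / c_scale_x) * d(u_n)/d(x_n), so the
    constant factors below carry the data normalization into the equation.
    """
    xyt_norm = xyt_norm.clone().requires_grad_(True)
    uvp = net(xyt_norm)
    u, v = uvp[:, 0:1], uvp[:, 1:2]
    u_grad = torch.autograd.grad(u, xyt_norm, torch.ones_like(u), create_graph=True)[0]
    v_grad = torch.autograd.grad(v, xyt_norm, torch.ones_like(v), create_graph=True)[0]
    du_dx = (f_scale[0] / c_scale[0]) * u_grad[:, 0:1]
    dv_dy = (f_scale[1] / c_scale[1]) * v_grad[:, 1:2]
    return du_dx + dv_dy

# Combined loss: data misfit on normalized fields + physics residual at the same points.
loss = torch.mean((net(coords) - fields) ** 2) + torch.mean(continuity_residual(coords) ** 2)
loss.backward()
```

The key design point is that the normalization is not applied to the data alone: the same affine map appears as fixed coefficients inside the residual, so no extra network evaluations or trainable parameters are introduced and the computational cost is unchanged. The momentum equations would be handled analogously, with second derivatives picking up the squared coordinate scale.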
About the journal:
Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries.
The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.