{"title":"用于学习任意初始和边界条件下偏微分方程广义解的物理信息变压器神经算子","authors":"Sumanth Kumar Boya, Deepak N. Subramani","doi":"10.1016/j.cpc.2025.109702","DOIUrl":null,"url":null,"abstract":"<div><div>Applications in physics, engineering, mechanics, and fluid dynamics necessitate solving nonlinear partial differential equations (PDEs) with different initial and boundary conditions. Operator learning, an emerging field, solves these PDEs by employing neural networks to map the infinite-dimensional input and output function spaces. These neural operators are trained using data (observations or simulations) and PDE residuals (physics loss). A key limitation of current neural methods is the need to retrain for new initial/boundary conditions and the substantial simulation data required for training. We introduce a physics-informed transformer neural operator (named PINTO) that generalizes efficiently to new conditions, trained solely with physics loss in a simulation-free setting. Our core innovation is the development of iterative kernel integral operator units that use cross-attention to transform domain points of PDE solutions into initial/boundary condition-aware representation vectors, supporting efficient and generalizable learning. The working of PINTO is demonstrated by simulating important 1D and 2D equations used in fluid mechanics, physics and engineering applications: advection, Burgers, and steady and unsteady Navier-Stokes equations (three flow scenarios). We show that under challenging unseen conditions, the relative errors compared to analytical or numerical (finite difference and volume) solutions are low, merely 20% to 33% of those obtained by other leading physics-informed neural operator methods. Furthermore, PINTO accurately solves advection and Burgers equations at time steps not present in the training points, an ability absent for other neural operators. The code is accessible at <span><span>https://github.com/quest-lab-iisc/PINTO</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":285,"journal":{"name":"Computer Physics Communications","volume":"315 ","pages":"Article 109702"},"PeriodicalIF":7.2000,"publicationDate":"2025-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"PINTO: Physics-informed transformer neural operator for learning generalized solutions of partial differential equations for any initial and boundary condition\",\"authors\":\"Sumanth Kumar Boya, Deepak N. Subramani\",\"doi\":\"10.1016/j.cpc.2025.109702\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Applications in physics, engineering, mechanics, and fluid dynamics necessitate solving nonlinear partial differential equations (PDEs) with different initial and boundary conditions. Operator learning, an emerging field, solves these PDEs by employing neural networks to map the infinite-dimensional input and output function spaces. These neural operators are trained using data (observations or simulations) and PDE residuals (physics loss). A key limitation of current neural methods is the need to retrain for new initial/boundary conditions and the substantial simulation data required for training. We introduce a physics-informed transformer neural operator (named PINTO) that generalizes efficiently to new conditions, trained solely with physics loss in a simulation-free setting. 
Our core innovation is the development of iterative kernel integral operator units that use cross-attention to transform domain points of PDE solutions into initial/boundary condition-aware representation vectors, supporting efficient and generalizable learning. The working of PINTO is demonstrated by simulating important 1D and 2D equations used in fluid mechanics, physics and engineering applications: advection, Burgers, and steady and unsteady Navier-Stokes equations (three flow scenarios). We show that under challenging unseen conditions, the relative errors compared to analytical or numerical (finite difference and volume) solutions are low, merely 20% to 33% of those obtained by other leading physics-informed neural operator methods. Furthermore, PINTO accurately solves advection and Burgers equations at time steps not present in the training points, an ability absent for other neural operators. The code is accessible at <span><span>https://github.com/quest-lab-iisc/PINTO</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":285,\"journal\":{\"name\":\"Computer Physics Communications\",\"volume\":\"315 \",\"pages\":\"Article 109702\"},\"PeriodicalIF\":7.2000,\"publicationDate\":\"2025-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Physics Communications\",\"FirstCategoryId\":\"101\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0010465525002048\",\"RegionNum\":2,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Physics Communications","FirstCategoryId":"101","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0010465525002048","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
PINTO: Physics-informed transformer neural operator for learning generalized solutions of partial differential equations for any initial and boundary condition
Applications in physics, engineering, mechanics, and fluid dynamics necessitate solving nonlinear partial differential equations (PDEs) with different initial and boundary conditions. Operator learning, an emerging field, solves these PDEs by employing neural networks to map between infinite-dimensional input and output function spaces. These neural operators are trained using data (observations or simulations) and PDE residuals (physics loss). A key limitation of current neural methods is the need to retrain for new initial/boundary conditions and the substantial simulation data required for training. We introduce a physics-informed transformer neural operator (named PINTO) that generalizes efficiently to new conditions and is trained solely with physics loss in a simulation-free setting. Our core innovation is the development of iterative kernel integral operator units that use cross-attention to transform domain points of PDE solutions into initial/boundary-condition-aware representation vectors, supporting efficient and generalizable learning. PINTO's performance is demonstrated by simulating important 1D and 2D equations used in fluid mechanics, physics, and engineering applications: the advection, Burgers, and steady and unsteady Navier-Stokes equations (three flow scenarios). We show that under challenging unseen conditions, the relative errors against analytical or numerical (finite difference and finite volume) solutions are low, merely 20% to 33% of those obtained by other leading physics-informed neural operator methods. Furthermore, PINTO accurately solves the advection and Burgers equations at time steps not present in the training points, an ability absent in other neural operators. The code is accessible at https://github.com/quest-lab-iisc/PINTO.
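To make the two mechanisms in the abstract concrete, below is a minimal sketch, not the authors' implementation: a cross-attention unit in which embedded domain points (queries) attend over embedded initial/boundary-condition samples (keys/values) to produce condition-aware representations, and a physics-loss term built from a PDE residual via automatic differentiation (1D Burgers is used purely as an example). Names such as CrossAttentionKernelUnit, burgers_residual_loss, and all dimensions are illustrative assumptions; the actual PINTO architecture and training loop are in the linked repository.

```python
# Illustrative sketch only; not the published PINTO code.
import math
import torch
import torch.nn as nn


class CrossAttentionKernelUnit(nn.Module):
    """Cross-attention block: domain-point queries attend over IC/BC samples."""

    def __init__(self, dim: int = 64, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, query_repr: torch.Tensor, condition_repr: torch.Tensor) -> torch.Tensor:
        # query_repr:     (batch, n_domain_points, dim)    embedded (x, t) query points
        # condition_repr: (batch, n_condition_points, dim) embedded IC/BC function samples
        attended, _ = self.attn(query_repr, condition_repr, condition_repr)
        h = self.norm1(query_repr + attended)      # residual connection + norm
        return self.norm2(h + self.mlp(h))         # position-wise feed-forward update


def burgers_residual_loss(model, xt: torch.Tensor, nu: float = 0.01 / math.pi) -> torch.Tensor:
    """Example physics loss for 1D Burgers: u_t + u*u_x - nu*u_xx = 0.

    `model` is assumed to map collocation points (x, t) to u for one fixed
    initial/boundary condition; no simulation data is needed.
    """
    xt = xt.clone().requires_grad_(True)           # (N, 2): columns are x and t
    u = model(xt)                                  # (N, 1) predicted solution values
    grads = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][:, 0:1]
    residual = u_t + u * u_x - nu * u_xx
    return (residual ** 2).mean()                  # mean squared PDE residual
```

In this sketch the condition-awareness comes entirely from the cross-attention step: changing the IC/BC samples changes `condition_repr`, and hence the representation of every domain point, without retraining the network.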
Journal description:
The focus of CPC is on contemporary computational methods and techniques and their implementation, the effectiveness of which will normally be evidenced by the author(s) within the context of a substantive problem in physics. Within this setting CPC publishes two types of paper.
Computer Programs in Physics (CPiP)
These papers describe significant computer programs to be archived in the CPC Program Library which is held in the Mendeley Data repository. The submitted software must be covered by an approved open source licence. Papers and associated computer programs that address a problem of contemporary interest in physics that cannot be solved by current software are particularly encouraged.
Computational Physics Papers (CP)
These are research papers on, but not limited to, the following themes across computational physics and related disciplines:
mathematical and numerical methods and algorithms;
computational models including those associated with the design, control and analysis of experiments; and
algebraic computation.
Each will normally include software implementation and performance details. The software implementation should, ideally, be available via GitHub, Zenodo or an institutional repository. In addition, research papers on the impact of advanced computer architectures and special-purpose computers on computing in the physical sciences, and on software topics related to, and of importance in, the physical sciences, may be considered.