Eunsuh Kim , Heejae Kwon , Sungha Cho , Kyongmin Yeo , Minseok Choi
Title: Stabilize physics-informed neural networks for stiff differential equations: Re-spacing layer
Journal: Computers & Mathematics with Applications, Volume 200, Pages 167-179 (Q1, Mathematics, Applied)
DOI: 10.1016/j.camwa.2025.09.014
Published: 2025-09-22
Citations: 0
Abstract
Approximating the solution of stiff differential equations, which exhibit abrupt changes in certain regions, with physics-informed neural networks (PINNs) is challenging. Training PINNs typically involves concentrating a larger number of samples around regions of rapid change to resolve the sharp gradients. However, this strategy leads to data imbalance, resulting in slower convergence and reduced solution quality. Here, we propose the Re-spacing layer (RS-layer) to mitigate these challenges. The RS-layer is a pre-trained encoding layer designed to map the skewed distribution of sampling points onto a uniform distribution, maintaining the desirable statistical properties of the input data for effective PINN training. We demonstrate that the RS-layer improves PINN training by regularizing the solution gradient in the transformed space. The efficacy of our method is validated through numerical experiments on one-dimensional singularly perturbed equations, the ROBER problem, and the Akzo Nobel problem. Our results show that the RS-layer not only accelerates convergence but also enhances accuracy.
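The paper pre-trains the RS-layer as a neural encoding layer, whose exact architecture and training procedure are not given in the abstract. As a minimal sketch of the underlying idea only, the rank-based empirical-CDF transform below maps any skewed one-dimensional set of sampling points onto equispaced points in [0, 1]; the `respace` function and the beta-distributed sample are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def respace(points):
    """Map a 1-D point set onto equispaced points in [0, 1] via its
    empirical CDF (rank transform): the i-th smallest point goes to
    i / (n - 1). This is a monotone map, so ordering is preserved."""
    points = np.asarray(points, dtype=float)
    order = np.argsort(points)
    ranks = np.empty(len(points), dtype=float)
    ranks[order] = np.arange(len(points))
    return ranks / (len(points) - 1)

# Samples clustered near x = 0, mimicking points concentrated
# in a boundary layer of a stiff problem
x = np.sort(np.random.default_rng(0).beta(0.3, 3.0, size=200))
u = respace(x)
```

Because the input `x` is sorted, the transformed points `u` are exactly equispaced on [0, 1], so a sharp gradient in the original variable is spread out in the transformed space, which is the regularizing effect the abstract attributes to the RS-layer.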
Journal description:
Computers & Mathematics with Applications provides a medium of exchange for those engaged in fields contributing to building successful simulations for science and engineering using Partial Differential Equations (PDEs).