{"title":"若干椭圆型二阶问题的最小二乘公式,前馈神经网络解及收敛结果","authors":"Jerome Pousin","doi":"10.1016/j.jcmds.2022.100023","DOIUrl":null,"url":null,"abstract":"<div><p>Recently some neural networks have been proposed for computing approximate solutions to partial differential equations. For second order elliptic or parabolic PDEs, this is possible by using penalized Least squares formulations of PDEs. In this article, for some second order elliptic PDEs we propose a theoretical setting, and we investigate the abstract convergence results between the solution and the computed one with neural networks. These results are obtained by minimizing appropriate loss functions made of a least squares formulation of the PDE augmented with a penalization term for accounting the Dirichlet boundary conditions. More precisely, it is shown that the error has two components, one due to the neural network and one due to the way the boundary conditions are imposed (via a penalization technic). The interplay between the two errors shows that the accuracy of the neural network has to be chosen accordingly with the accuracy of the boundary conditions.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"2 ","pages":"Article 100023"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772415822000013/pdfft?md5=1f9531f8460690468524068d54e3add7&pid=1-s2.0-S2772415822000013-main.pdf","citationCount":"3","resultStr":"{\"title\":\"Least squares formulations for some elliptic second order problems, feedforward neural network solutions and convergence results\",\"authors\":\"Jerome Pousin\",\"doi\":\"10.1016/j.jcmds.2022.100023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Recently some neural networks have been proposed for computing approximate solutions to partial differential equations. For second order elliptic or parabolic PDEs, this is possible by using penalized Least squares formulations of PDEs. In this article, for some second order elliptic PDEs we propose a theoretical setting, and we investigate the abstract convergence results between the solution and the computed one with neural networks. These results are obtained by minimizing appropriate loss functions made of a least squares formulation of the PDE augmented with a penalization term for accounting the Dirichlet boundary conditions. More precisely, it is shown that the error has two components, one due to the neural network and one due to the way the boundary conditions are imposed (via a penalization technic). 
The interplay between the two errors shows that the accuracy of the neural network has to be chosen accordingly with the accuracy of the boundary conditions.</p></div>\",\"PeriodicalId\":100768,\"journal\":{\"name\":\"Journal of Computational Mathematics and Data Science\",\"volume\":\"2 \",\"pages\":\"Article 100023\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2772415822000013/pdfft?md5=1f9531f8460690468524068d54e3add7&pid=1-s2.0-S2772415822000013-main.pdf\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Computational Mathematics and Data Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2772415822000013\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Mathematics and Data Science","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772415822000013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Recently, several neural networks have been proposed for computing approximate solutions to partial differential equations. For second order elliptic or parabolic PDEs, this can be done by using penalized least squares formulations of the equations. In this article, we propose a theoretical setting for some second order elliptic PDEs and investigate abstract convergence results between the exact solution and the one computed with neural networks. These results are obtained by minimizing appropriate loss functions, built from a least squares formulation of the PDE augmented with a penalization term accounting for the Dirichlet boundary conditions. More precisely, it is shown that the error has two components: one due to the neural network and one due to the way the boundary conditions are imposed (via a penalization technique). The interplay between the two errors shows that the accuracy of the neural network has to be chosen consistently with the accuracy with which the boundary conditions are imposed.
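To make the construction described in the abstract concrete, the following is a minimal sketch, not the formulation analyzed in the article, of such a penalized least squares loss for a model Poisson problem on the unit square, with a small feedforward network trained in PyTorch. The problem data f and g, the penalization parameter eps, and the network architecture are illustrative assumptions.

```python
# Minimal sketch only: a model Poisson problem -Δu = f on (0,1)^2 with Dirichlet
# data u = g on the boundary, solved by minimizing a penalized least squares loss
#   L(θ) = mean |Δu_θ(x) + f(x)|^2  +  (1/eps) * mean |u_θ(y) - g(y)|^2
# over interior collocation points x and boundary points y. The data f, g, the
# penalization parameter eps and the network size are illustrative assumptions,
# not the formulation analyzed in the article.
import math
import torch

torch.manual_seed(0)

# Small feedforward network u_θ : R^2 -> R
model = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def f(x):
    # Right-hand side chosen so that u(x1, x2) = sin(pi x1) sin(pi x2) solves -Δu = f
    return 2 * math.pi**2 * torch.sin(math.pi * x[:, :1]) * torch.sin(math.pi * x[:, 1:])

def g(x):
    # Homogeneous Dirichlet data on the boundary of the unit square
    return torch.zeros(x.shape[0], 1)

def laplacian(u, x):
    # Δu_θ at the points x, via two rounds of automatic differentiation
    grad = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    return sum(torch.autograd.grad(grad[:, d].sum(), x, create_graph=True)[0][:, d:d + 1]
               for d in range(x.shape[1]))

def boundary_points(n):
    # n uniformly sampled points on each of the four edges of the unit square
    t, zeros, ones = torch.rand(n, 1), torch.zeros(n, 1), torch.ones(n, 1)
    return torch.cat([torch.cat([t, zeros], 1), torch.cat([t, ones], 1),
                      torch.cat([zeros, t], 1), torch.cat([ones, t], 1)], 0)

eps = 1e-3  # penalization parameter: smaller eps enforces the Dirichlet condition more strongly
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x_in = torch.rand(256, 2, requires_grad=True)   # interior collocation points
    x_bd = boundary_points(64)                      # boundary collocation points

    residual = laplacian(model(x_in), x_in) + f(x_in)   # least squares PDE residual Δu_θ + f
    bc_error = model(x_bd) - g(x_bd)                    # boundary mismatch u_θ - g
    loss = residual.pow(2).mean() + (1.0 / eps) * bc_error.pow(2).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch, eps controls how accurately the Dirichlet condition is imposed, while the network size and the number of collocation points control the accuracy of the neural network approximation; the abstract's conclusion is precisely that these two accuracies have to be balanced against each other.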