{"title":"A reduced-form multigrid approach for ANN equivalent to classic multigrid expansion","authors":"Jeong-Kweon Seo","doi":"10.1007/s00521-024-10311-1","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we investigate the method of solving partial differential equations (PDEs) using artificial neural network (ANN) structures, which have been actively applied in artificial intelligence models. The ANN model for solving PDEs offers the advantage of providing explicit and continuous solutions. However, the ANN model for solving PDEs cannot construct a conventionally solvable linear system with known matrix solvers; thus, computational speed could be a significant concern. We study the implementation of the multigrid method, developing a general concept for a coarse-grid correction method to be integrated into the ANN-PDE architecture, with the goal of enhancing computational efficiency. By developing a reduced form of the multigrid method for ANN, we demonstrate that it can be interpreted as an equivalent representation of the classic multigrid expansion. We validated the applicability of the proposed method through rigorous experiments, which included analyzing loss decay and the number of iterations along with improvements in terms of accuracy, speed, and complexity. We accomplished this by employing the gradient descent method and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method to update the gradients while solving the given ANN systems of PDEs.</p>","PeriodicalId":18925,"journal":{"name":"Neural Computing and Applications","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computing and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s00521-024-10311-1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we investigate solving partial differential equations (PDEs) with artificial neural network (ANN) structures, which have been widely adopted in artificial-intelligence models. An ANN model for solving PDEs offers the advantage of providing an explicit, continuous solution. However, such a model does not yield a conventional linear system that can be handled by standard matrix solvers, so computational speed can be a significant concern. We study the implementation of the multigrid method and develop a general coarse-grid correction scheme to be integrated into the ANN-PDE architecture, with the goal of enhancing computational efficiency. By deriving a reduced form of the multigrid method for ANNs, we demonstrate that it can be interpreted as an equivalent representation of the classic multigrid expansion. We validated the applicability of the proposed method through rigorous experiments, analyzing loss decay and the number of iterations along with improvements in accuracy, speed, and complexity. We accomplished this by employing the gradient descent method and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method to perform the gradient-based parameter updates while solving the given ANN systems of PDEs.
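To make the workflow described in the abstract concrete, the sketch below shows a minimal PINN-style solver for a 1D Poisson problem, trained first on a coarse set of collocation points and then refined on a finer set, loosely mimicking a coarse-grid-correction schedule, with a gradient-descent warm-up followed by (L-)BFGS refinement as the two optimizers mentioned above. The problem choice, network, point counts, and two-level schedule are illustrative assumptions; this is not the paper's reduced-form multigrid method.

```python
# Illustrative sketch (not the paper's method): a PINN-style solver for
#   -u''(x) = pi^2 * sin(pi x),  u(0) = u(1) = 0,
# trained on a coarse collocation set, then warm-started on a fine set,
# echoing the coarse-grid-correction idea and the GD + BFGS optimizer pair.
import torch

torch.manual_seed(0)

class MLP(torch.nn.Module):
    def __init__(self, width=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(1, width), torch.nn.Tanh(),
            torch.nn.Linear(width, width), torch.nn.Tanh(),
            torch.nn.Linear(width, 1),
        )

    def forward(self, x):
        return self.net(x)

def pde_residual(model, x):
    # Residual of -u'' - f with f(x) = pi^2 sin(pi x), computed via autograd.
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = torch.pi ** 2 * torch.sin(torch.pi * x)
    return -d2u - f

def loss_fn(model, x_interior):
    # PDE residual loss plus a penalty enforcing the zero boundary values.
    bc = torch.tensor([[0.0], [1.0]])
    res = pde_residual(model, x_interior)
    return (res ** 2).mean() + (model(bc) ** 2).mean()

def train_level(model, n_points, n_gd_steps=500, n_lbfgs_steps=50):
    # One "grid level": Adam (gradient descent) warm-up, then L-BFGS refinement.
    x = torch.linspace(0.0, 1.0, n_points).reshape(-1, 1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(n_gd_steps):
        opt.zero_grad()
        loss = loss_fn(model, x)
        loss.backward()
        opt.step()
    lbfgs = torch.optim.LBFGS(model.parameters(), max_iter=n_lbfgs_steps)
    def closure():
        lbfgs.zero_grad()
        loss = loss_fn(model, x)
        loss.backward()
        return loss
    lbfgs.step(closure)
    return loss_fn(model, x).item()

model = MLP()
coarse_loss = train_level(model, n_points=17)   # coarse "grid"
fine_loss = train_level(model, n_points=129)    # fine "grid", warm-started
print(f"coarse-level loss: {coarse_loss:.3e}, fine-level loss: {fine_loss:.3e}")
```

In this toy setup the fine-level optimization starts from the coarse-level solution rather than from scratch, which is the qualitative efficiency benefit the abstract attributes to coarse-grid correction; the paper's contribution is a reduced form of this idea shown to be equivalent to the classic multigrid expansion.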