Deep learning-based method for solving seepage equation under unsteady boundary
Daolun Li, Luhang Shen, Wenshu Zha, Shuaijun Lv
International Journal for Numerical Methods in Fluids (Q3, Computer Science, Interdisciplinary Applications). Published 2023-09-14. DOI: 10.1002/fld.5238 (https://onlinelibrary.wiley.com/doi/10.1002/fld.5238)
Deep learning-based methods for solving partial differential equations have become a research hotspot. Building on previous work that applies deep learning to partial differential equations, the approach avoids the need for meshing and linearization. However, deep learning-based methods struggle to solve complex turbulent systems effectively without labeled data, and issues such as failure to converge and unstable solutions are frequently encountered. To address these issues, this paper presents an approximation-correction model for solving the seepage equation with unsteady boundaries. The model consists of two neural networks: the first acts as an asymptotic block, estimating the progression of the solution from its asymptotic form; the second fine-tunes the errors left by the asymptotic block. The solution of the unsteady boundary problem is obtained by superimposing these progressive blocks. In numerical experiments, both a constant-flow scenario and a three-stage-flow scenario in reservoir exploitation are considered. The results demonstrate the method's effectiveness when compared to numerical solutions. Furthermore, the error analysis reveals that the method achieves superior solution accuracy compared with other baseline methods.
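The abstract's two-network design can be illustrated with a minimal structural sketch: one network proposes an asymptotic estimate of the solution, a second network predicts a correction, and the final pressure field is their superposition. This is only a sketch of the idea under stated assumptions; the layer sizes, input variables (x, t), and untrained random weights below are hypothetical, and the paper's actual loss terms and training procedure are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(sizes, rng):
    """Random (untrained) weights for a small fully connected network."""
    return [(rng.normal(0.0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_apply(params, x):
    """Forward pass: tanh hidden activations, linear output layer."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# Two networks, as in the approximation-correction idea:
# an "asymptotic block" that estimates the coarse solution, and a
# "correction block" that fine-tunes the asymptotic block's error.
asymptotic = mlp_init([2, 16, 16, 1], rng)  # hypothetical inputs: (x, t)
correction = mlp_init([2, 16, 16, 1], rng)

def pressure(xt):
    """Superposition of the asymptotic estimate and its correction."""
    return mlp_apply(asymptotic, xt) + mlp_apply(correction, xt)

# Evaluate on a small space-time grid (weights are untrained; this only
# demonstrates the structure, not a converged solution).
xt = np.stack(np.meshgrid(np.linspace(0, 1, 5),
                          np.linspace(0, 1, 5)), axis=-1).reshape(-1, 2)
p = pressure(xt)
print(p.shape)  # (25, 1)
```

In a training loop, both networks would be fitted jointly against a physics-informed residual of the seepage equation plus the unsteady boundary conditions; the superposition keeps the correction network's job small, which is the stated motivation for the split.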
Journal introduction:
The International Journal for Numerical Methods in Fluids publishes refereed papers describing significant developments in computational methods that are applicable to scientific and engineering problems in fluid mechanics, fluid dynamics, micro- and bio-fluidics, and fluid-structure interaction. Numerical methods for solving ancillary equations, such as transport and advection-diffusion, are also relevant. The Editors encourage contributions in the areas of multi-physics, multi-disciplinary and multi-scale problems involving fluid subsystems, verification and validation, uncertainty quantification, and model reduction.
Numerical examples that illustrate the described methods or their accuracy are in general expected. Discussions of papers already in print are also considered. However, papers dealing strictly with applications of existing methods or dealing with areas of research that are not deemed to be cutting edge by the Editors will not be considered for review.
The journal publishes full-length papers, which should normally be less than 25 journal pages in length. Two-part papers are discouraged unless considered necessary by the Editors.