Total uncertainty quantification in inverse solutions with deep learning surrogate models

IF 3.8 · CAS Tier 2 (Physics & Astronomy) · JCR Q2 (Computer Science, Interdisciplinary Applications)
Yuanzhe Wang, James L. McCreight, Joseph D. Hughes, Alexandre M. Tartakovsky
{"title":"Total uncertainty quantification in inverse solutions with deep learning surrogate models","authors":"Yuanzhe Wang ,&nbsp;James L. McCreight ,&nbsp;Joseph D. Hughes ,&nbsp;Alexandre M. Tartakovsky","doi":"10.1016/j.jcp.2025.114315","DOIUrl":null,"url":null,"abstract":"<div><div>We propose an approximate Bayesian method for quantifying the total uncertainty in inverse partial differential equation (PDE) solutions obtained with machine learning surrogate models, including operator learning models. The proposed method accounts for uncertainty in the observations, PDE, and surrogate models. First, we use the surrogate model to formulate a minimization problem in the reduced space for the maximum a posteriori (MAP) inverse solution. Then, we randomize the MAP objective function and obtain samples of the posterior distribution by minimizing different realizations of the objective function. We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a nonlinear diffusion equation with an unknown space-dependent diffusion coefficient. Among other applications, this equation describes the flow of groundwater in an unconfined aquifer. Depending on the training dataset and ensemble sizes, the proposed method provides similar or more descriptive posteriors of the parameters and states than the iterative ensemble smoother method. Deep ensembling underestimates uncertainty and provides less-informative posteriors than the other two methods. Our results show that, despite inherent uncertainty, surrogate models can be used for parameter and state estimation as an alternative to the inverse methods relying on (more accurate) numerical PDE solvers.</div></div>","PeriodicalId":352,"journal":{"name":"Journal of Computational Physics","volume":"541 ","pages":"Article 114315"},"PeriodicalIF":3.8000,"publicationDate":"2025-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Physics","FirstCategoryId":"101","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0021999125005984","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

We propose an approximate Bayesian method for quantifying the total uncertainty in inverse partial differential equation (PDE) solutions obtained with machine learning surrogate models, including operator learning models. The proposed method accounts for uncertainty in the observations, PDE, and surrogate models. First, we use the surrogate model to formulate a minimization problem in the reduced space for the maximum a posteriori (MAP) inverse solution. Then, we randomize the MAP objective function and obtain samples of the posterior distribution by minimizing different realizations of the objective function. We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a nonlinear diffusion equation with an unknown space-dependent diffusion coefficient. Among other applications, this equation describes the flow of groundwater in an unconfined aquifer. Depending on the training dataset and ensemble sizes, the proposed method provides similar or more descriptive posteriors of the parameters and states than the iterative ensemble smoother method. Deep ensembling underestimates uncertainty and provides less-informative posteriors than the other two methods. Our results show that, despite inherent uncertainty, surrogate models can be used for parameter and state estimation as an alternative to the inverse methods relying on (more accurate) numerical PDE solvers.
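The sampling step described above (randomize the MAP objective and minimize independent realizations of it) can be illustrated with a minimal sketch. Everything below is hypothetical: the placeholder surrogate, the noise and prior scales, and the problem dimensions are stand-ins for illustration only, not the paper's trained operator-learning surrogate or its groundwater test case.

```python
# Minimal sketch of randomized-MAP posterior sampling with a surrogate model.
# Assumptions (not from the paper): a generic surrogate `surrogate(theta)` mapping
# reduced parameters to predicted observations, Gaussian observation noise, and an
# i.i.d. Gaussian prior on the reduced parameters.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# --- hypothetical problem setup -------------------------------------------
n_param, n_obs = 8, 20
A = rng.normal(size=(n_obs, n_param))          # stand-in for a trained surrogate

def surrogate(theta):
    """Placeholder surrogate: mildly nonlinear map from parameters to observables."""
    return A @ theta + 0.1 * np.tanh(theta).sum()

sigma_obs, sigma_prior = 0.05, 1.0             # assumed noise / prior std
theta_true = rng.normal(size=n_param)
y_obs = surrogate(theta_true) + sigma_obs * rng.normal(size=n_obs)
theta_prior_mean = np.zeros(n_param)

# --- randomized MAP objective ---------------------------------------------
def map_objective(theta, y_pert, prior_pert):
    """Perturbed MAP objective: data misfit plus prior regularization."""
    misfit = y_pert - surrogate(theta)
    reg = theta - prior_pert
    return (misfit @ misfit) / (2 * sigma_obs**2) + (reg @ reg) / (2 * sigma_prior**2)

def sample_posterior(n_samples=50):
    """Approximate posterior samples: minimize one perturbed objective per sample."""
    samples = []
    for _ in range(n_samples):
        y_pert = y_obs + sigma_obs * rng.normal(size=n_obs)          # perturb data
        prior_pert = theta_prior_mean + sigma_prior * rng.normal(size=n_param)  # perturb prior
        res = minimize(map_objective, x0=prior_pert,
                       args=(y_pert, prior_pert), method="L-BFGS-B")
        samples.append(res.x)
    return np.array(samples)

samples = sample_posterior()
print("posterior mean:", samples.mean(axis=0))
print("posterior std: ", samples.std(axis=0))
```

In this sketch each minimization starts from its own perturbed prior draw; for a linear surrogate with Gaussian prior and noise this randomization yields exact posterior samples, while for nonlinear surrogates (as in the paper's setting) the resulting ensemble is an approximation to the posterior.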
Source journal

Journal of Computational Physics (Physics - Computer Science: Interdisciplinary Applications)
CiteScore: 7.60
Self-citation rate: 14.60%
Articles per year: 763
Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.