{"title":"Sobol’ sensitivity indices – A Machine Learning approach using the Dynamic Adaptive Variances Estimator with Given Data","authors":"Ivano Azzini, Rossana Rosati","doi":"10.1615/int.j.uncertaintyquantification.2024051654","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051654","url":null,"abstract":"Global sensitivity analysis is today a widely recognized discipline, extensively applied in a growing number of domains. Methodological developments, available software, and a broader knowledge of and debate on the topic now make feasible investigations that were simply impossible, or too demanding, a few years ago. Among global sensitivity methods, variance-based techniques and the Monte Carlo estimators of Sobol’ sensitivity indices are the most widely implemented, owing to their versatility and ease of interpretation. Nevertheless, the strict dependence of the analysis cost on the number of investigated factors, and the need for a designed input, remain major issues. The number of required model evaluations can be reduced by using quasi-Monte Carlo sequences, by studying groups of inputs, and by computing the sensitivity indices with higher-performing estimators such as the innovative algorithm based on dynamic adaptive variances recently proposed by the authors. However, even though these strategies significantly cut the necessary model runs, none of them overcomes the barrier of a structured input. This paper proposes a machine learning approach that allows us to estimate Sobol’ indices with the dynamic adaptive variances estimator starting from a set of given Monte Carlo data. Tests have been run on three relevant functions. 
In most cases, the results are very promising and appear to overcome the limitation of a designed-data approach while keeping all the advantages of the Sobol’ Monte Carlo estimator.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142269724","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
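The dynamic adaptive variances estimator itself is not reproduced here, but the Monte Carlo Sobol' machinery it improves on can be sketched with a standard pick-freeze (Saltelli-style) estimator of first-order indices on the Ishigami function, a common sensitivity-analysis test case; the sample size and function choice below are illustrative, not from the paper.

```python
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    # Standard global-sensitivity test function on [-pi, pi]^3.
    return np.sin(X[:, 0]) + a * np.sin(X[:, 1])**2 + b * X[:, 2]**4 * np.sin(X[:, 0])

def first_order_sobol(f, d, n, rng):
    # Pick-freeze Monte Carlo estimator of the first-order indices S_i:
    # V_i ~= mean( f(B) * (f(AB_i) - f(A)) ), with AB_i built from A
    # except for column i, which is taken from B.
    A = rng.uniform(-np.pi, np.pi, size=(n, d))
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

rng = np.random.default_rng(0)
S = first_order_sobol(ishigami, d=3, n=200_000, rng=rng)
```

For a = 7, b = 0.1 the analytical first-order indices are approximately 0.314, 0.442, and 0, which gives a sanity check for any Sobol' estimator.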
{"title":"Bayesian³ Active learning for regularized arbitrary multi-element polynomial chaos using information theory","authors":"Ilja Kröker, Tim Brünnette, Nils Wildt, Maria Fernanda Morales Oreamuno, Rebecca Kohlhaas, Sergey Oladyshkin, Wolfgang Nowak","doi":"10.1615/int.j.uncertaintyquantification.2024052675","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024052675","url":null,"abstract":"Machine learning, surrogate modeling, and uncertainty quantification pose challenges in data-poor applications, which arise from the limited availability of measurement data or from computationally expensive models. Specialized models, derived from Gaussian process emulators (GPE) or polynomial chaos expansions (PCE), are often used when only a limited number of training points is available. The PCE (or its data-driven version, the arbitrary polynomial chaos) is based on a global representation informed by the distributions of model parameters, whereas GPEs rely on a local kernel and additionally assess the uncertainty of the surrogate itself. Oscillation-mitigating localizations of the PCE result in increased degrees of freedom (DoF), requiring more training samples. As applications such as Bayesian inference (BI) require highly accurate surrogates, even specialized models like PCE or GPE need a substantial amount of training data. Bayesian³ active learning (B³AL) on GPEs, based on information theory (IT), can reduce the necessary number of training samples for BI. IT-based ideas for B³AL are not yet directly transferable to the PCE family, as this family lacks awareness of surrogate uncertainty by design. In the present work, we introduce a Bayesian regularized version of localized arbitrary polynomial chaos to build surrogate models. Equipped with Gaussian emulator properties, our fully adaptive framework is enhanced with B³AL methods designed to achieve reliable surrogate models for BI while efficiently selecting training samples via IT. 
The effectiveness of the proposed methodology is demonstrated by comprehensive evaluations on several numerical examples.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142188535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
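The B³AL framework above builds on polynomial chaos surrogates; the core idea of a PCE fit by least squares can be sketched in a few lines, here with a 1D probabilists'-Hermite basis for a standard-normal input. The model function, degree, and sample sizes are invented for illustration and have nothing to do with the paper's examples.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Toy "expensive" model of a standard-normal input xi.
def model(xi):
    return np.exp(0.3 * xi)

rng = np.random.default_rng(1)
xi_train = rng.standard_normal(200)   # training samples (the scarce resource)
y_train = model(xi_train)

# Fit y ~= sum_k c_k He_k(xi) by least squares; the probabilists' Hermite
# polynomials He_k are orthogonal under the N(0,1) input distribution, so
# c_0 estimates the model's mean.
degree = 6
V = hermevander(xi_train, degree)     # design matrix of He_0 .. He_6
coeffs, *_ = np.linalg.lstsq(V, y_train, rcond=None)

# Evaluate the surrogate on fresh points.
xi_test = rng.standard_normal(1000)
y_pred = hermevander(xi_test, degree) @ coeffs
err = np.max(np.abs(y_pred - model(xi_test)))
```

For this model the exact mean is E[exp(0.3 xi)] = exp(0.045), so the zeroth coefficient should land close to 1.046.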
{"title":"A novel probabilistic transfer learning strategy for polynomial regression","authors":"Wyatt Bridgman, Uma Balakrishnan, Reese E. Jones, Jiefu Chen, Xuqing Wu, Cosmin Safta, Yueqin Huang, Mohammad Khalil","doi":"10.1615/int.j.uncertaintyquantification.2024052051","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024052051","url":null,"abstract":"In the fields of surrogate modeling and, more recently, machine learning, transfer learning methodologies have been proposed in which knowledge from a source task is transferred to a target task where sparse and/or noisy data result in an ill-posed calibration problem. Such sparsity can result from prohibitively expensive forward model simulations or simply from a lack of experimental data. Transfer learning attempts to improve target model calibration by leveraging similarities between the source and target tasks. This often takes the form of parameter-based transfer, which exploits correlations between the parameters defining the source and target models in order to regularize the target task. The majority of these approaches are deterministic and do not account for uncertainty in the model parameters. In this work, we propose a novel probabilistic transfer learning methodology which transfers knowledge from the posterior distribution of the source to the target Bayesian inverse problem using an approach inspired by data assimilation. While the methodology is presented generally, it is subsequently investigated in the context of polynomial regression and, more specifically, polynomial chaos expansions, which result in Gaussian posterior distributions in the case of i.i.d. Gaussian observation noise and conjugate Gaussian prior distributions. 
The strategy is evaluated using numerical investigations and applied to an engineering problem from the oil and gas industry.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142188534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
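The data-assimilation-inspired transfer in the paper is more elaborate, but the conjugate Gaussian setting the abstract mentions admits a minimal sketch: fit the source task with plentiful data, then reuse its posterior as the prior for a sparse target task. The coefficients, sample sizes, and noise level below are invented for the demo.

```python
import numpy as np

def gaussian_posterior(Phi, y, m0, S0, noise_var):
    # Conjugate update for y = Phi w + eps, eps ~ N(0, noise_var I),
    # with prior w ~ N(m0, S0); the posterior is Gaussian N(mN, SN).
    S0_inv = np.linalg.inv(S0)
    SN = np.linalg.inv(S0_inv + Phi.T @ Phi / noise_var)
    mN = SN @ (S0_inv @ m0 + Phi.T @ y / noise_var)
    return mN, SN

rng = np.random.default_rng(2)
true_w = np.array([1.0, -2.0, 0.5])          # hypothetical polynomial coefficients
poly = lambda x: np.vander(x, 3, increasing=True)

# Source task: plentiful data -> informative posterior.
xs = rng.uniform(-1, 1, 200)
ys = poly(xs) @ true_w + 0.1 * rng.standard_normal(200)
m_src, S_src = gaussian_posterior(poly(xs), ys, np.zeros(3), np.eye(3), 0.01)

# Target task: only 5 points; reuse the source posterior as the target prior,
# a crude form of probabilistic parameter-based transfer.
xt = rng.uniform(-1, 1, 5)
yt = poly(xt) @ true_w + 0.1 * rng.standard_normal(5)
m_tgt, S_tgt = gaussian_posterior(poly(xt), yt, m_src, S_src, 0.01)
```

Because the update is conjugate, the target posterior can only tighten the source posterior, which is the regularizing effect transfer learning aims for.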
{"title":"Variance-based sensitivity of Bayesian inverse problems to the prior distribution","authors":"John Darges, Alen Alexanderian, Pierre Gremaud","doi":"10.1615/int.j.uncertaintyquantification.2024051475","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051475","url":null,"abstract":"The formulation of Bayesian inverse problems involves choosing prior distributions; choices that seem equally reasonable may lead to significantly different conclusions. We develop a computational approach to better understand the impact of the hyperparameters defining the prior on the posterior statistics of the quantities of interest. Our approach relies on global sensitivity analysis (GSA) of Bayesian inverse problems with respect to the hyperparameters defining the prior. This, however, is a challenging problem: a naive double-loop sampling approach would require running a prohibitive number of Markov chain Monte Carlo (MCMC) sampling procedures. The present work takes a foundational step in making such a sensitivity analysis practical through (i) a judicious combination of efficient surrogate models and (ii) a tailored importance sampling method. In particular, we can perform accurate GSA of posterior prediction statistics with respect to prior hyperparameters without having to repeat MCMC runs. 
We demonstrate the effectiveness of the approach on a simple Bayesian linear inverse problem and on a nonlinear inverse problem governed by an epidemiological model.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142188536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
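One ingredient mentioned above, reusing existing posterior samples under a different prior via importance sampling rather than rerunning MCMC, can be illustrated in a conjugate toy problem where the exact answer is available for comparison. The normal-normal model and all numbers are illustrative; the paper's tailored scheme is more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: y_i ~ N(theta, sig2) with a conjugate normal prior, so the
# posterior is Gaussian and its moments are known in closed form.
sig2, n_obs = 1.0, 20
y = rng.normal(0.8, np.sqrt(sig2), n_obs)

def post_moments(mu_pr, var_pr):
    var = 1.0 / (1.0 / var_pr + n_obs / sig2)
    return var * (mu_pr / var_pr + y.sum() / sig2), var

# "MCMC" samples under the baseline prior N(0, 4) (exact draws, for the demo).
m0, v0 = post_moments(0.0, 4.0)
theta = rng.normal(m0, np.sqrt(v0), 100_000)

# Posterior mean under a perturbed prior N(1, 1) by self-normalized
# importance re-weighting with weights p1(theta)/p0(theta), no new MCMC run.
log_w = (-0.5 * (theta - 1.0) ** 2 / 1.0) - (-0.5 * theta ** 2 / 4.0)
w = np.exp(log_w - log_w.max())
w /= w.sum()
mean_reweighted = np.sum(w * theta)

m1, _ = post_moments(1.0, 1.0)   # exact answer for comparison
```

This works because the two posteriors differ only by the prior ratio, so reweighting existing samples reproduces expectations under the perturbed prior.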
{"title":"Extremes of vector-valued processes by finite dimensional models","authors":"Hui Xu, Mircea D. Grigoriu","doi":"10.1615/int.j.uncertaintyquantification.2024051826","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051826","url":null,"abstract":"Finite dimensional (FD) models, i.e., deterministic functions of time/space and finite sets of random variables, are constructed for target vector-valued random processes/fields. They are required to have two properties. First, standard Monte Carlo algorithms can be used to generate their samples, referred to as FD samples. Second, under conditions specified by several theorems, FD samples can be used to estimate distributions of extremes and other functionals of target random functions. Numerical illustrations involving two-dimensional random processes and apparent properties of random microstructures demonstrate the implementation of FD models for these stochastic problems and show that the models are accurate if the conditions of our theorems are satisfied.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141942561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
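A minimal FD model in the sense described above is a truncated spectral representation of a stationary Gaussian process: a deterministic function of time and finitely many standard-normal variables, whose Monte Carlo samples can then be used to estimate the distribution of extremes. The frequencies, variances, and threshold below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# FD model of a zero-mean stationary Gaussian process:
#   X(t) = sum_k sigma_k * (A_k cos(w_k t) + B_k sin(w_k t)),
# with A_k, B_k independent N(0, 1); three terms for the demo.
t = np.linspace(0.0, 10.0, 201)
w = np.array([0.5, 1.0, 2.0])
sig = np.array([1.0, 0.6, 0.3])

def fd_samples(n):
    A = rng.standard_normal((n, 3))
    B = rng.standard_normal((n, 3))
    return (sig * A) @ np.cos(np.outer(w, t)) + (sig * B) @ np.sin(np.outer(w, t))

# Estimate the distribution of the extreme max_t X(t) from FD samples.
X = fd_samples(50_000)
extremes = X.max(axis=1)
p_exceed = np.mean(extremes > 3.0)
```

Because sampling the FD model only requires standard normals, any Monte Carlo functional estimate (here an exceedance probability of the path maximum) comes essentially for free.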
{"title":"Learning a class of stochastic differential equations via numerics-informed Bayesian denoising","authors":"Zhanpeng Wang, Lijin Wang, Yanzhao Cao","doi":"10.1615/int.j.uncertaintyquantification.2024052020","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024052020","url":null,"abstract":"Learning stochastic differential equations (SDEs) from observational data via neural networks is an important means of quantifying uncertainty in dynamical systems. The learning networks are typically built upon denoising the stochastic systems by harnessing their inherent deterministic nature, such as the Fokker-Planck equations related to SDEs. In this paper we propose numerics-informed denoising, obtained by taking expectations of the Euler-Maruyama numerical scheme of SDEs, and then use Bayesian neural networks (BNNs) to approximate the expectations through variational inference on the posterior distribution of the weights. The approximation accuracy of the BNNs is analyzed. Meanwhile, we give a data acquisition method for learning non-autonomous differential equations (NADEs) which respects the time-variant nature of NADEs' flows. Numerical experiments on three models show the effectiveness of the proposed methods.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141781809","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
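The denoising idea above rests on the Euler-Maruyama conditional expectation E[X_{n+1} | X_n] = X_n + f(X_n) dt, which removes the noise from the drift. It can be illustrated on an Ornstein-Uhlenbeck process, with a linear least-squares fit standing in for the paper's Bayesian neural network; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate dX = -theta * X dt + sigma dW with Euler-Maruyama.
theta, sigma, dt, n = 1.5, 0.5, 0.01, 200_000
x = np.empty(n)
x[0] = 1.0
noise = rng.standard_normal(n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * noise[i]

# "Denoise" via the scheme's expectation: E[dX | X] = f(X) * dt, so
# regressing increments on the state recovers a linear drift f(x) = a * x.
dx = np.diff(x)
a_hat = np.dot(x[:-1], dx / dt) / np.dot(x[:-1], x[:-1])
```

For this process the recovered slope should be close to -theta; in the paper the regression target is learned by a BNN, which additionally quantifies uncertainty in the fit.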
{"title":"Covariance estimation using h-statistics in Monte Carlo and multilevel Monte Carlo methods","authors":"Sharana Kumar Shivanand","doi":"10.1615/int.j.uncertaintyquantification.2024051528","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051528","url":null,"abstract":"We present novel Monte Carlo (MC) and multilevel Monte Carlo (MLMC) methods to determine the unbiased covariance of random variables using h-statistics. The advantage of this procedure lies in the unbiased construction of the estimator's mean square error in closed form. This is in contrast to conventional MC and MLMC covariance estimators, which are based on biased mean square errors defined solely by upper bounds, particularly within MLMC. The algorithms are demonstrated numerically by estimating the covariance of the stochastic response of a simple 1D stochastic elliptic PDE, namely a Poisson model.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141587855","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
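The paper's contribution is the closed-form mean square error of h-statistic estimators inside MLMC; the basic unbiased covariance h-statistic itself reduces to the familiar 1/(n-1) sample covariance and is easy to check directly.

```python
import numpy as np

def h_cov(x, y):
    # Unbiased covariance estimator (bivariate h-statistic h_{1,1}):
    #   h_11 = (n * sum(x*y) - sum(x) * sum(y)) / (n * (n - 1)),
    # algebraically identical to the 1/(n-1) sample covariance.
    n = len(x)
    return (n * np.dot(x, y) - x.sum() * y.sum()) / (n * (n - 1))

rng = np.random.default_rng(6)
x = rng.standard_normal(1000)
y = 0.7 * x + rng.standard_normal(1000)  # true covariance 0.7
c = h_cov(x, y)
```

What h-statistics add beyond this familiar formula is that the variance (and hence the MSE) of such estimators is itself expressible unbiasedly in closed form, which is what the MLMC level allocation in the paper exploits.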
{"title":"Bayesian Parameter Inference for Partially Observed Diffusions using Multilevel Stochastic Runge-Kutta Methods","authors":"Pierre Del Moral, Shulan Hu, Ajay Jasra, Hamza Ruzayqat, Xinyu Wang","doi":"10.1615/int.j.uncertaintyquantification.2024051131","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051131","url":null,"abstract":"We consider the problem of Bayesian estimation of static parameters associated with a partially and discretely observed diffusion process. We assume that the exact transition dynamics of the diffusion process are unavailable, even up to an unbiased estimator, and that one must time-discretize the diffusion process. In such scenarios it has been shown how one can introduce the multilevel Monte Carlo method to reduce the cost of computing posterior expected values of the parameters for a pre-specified mean square error (MSE); see \cite{jasra_bpe_sde}. These methods rely upon the Euler-Maruyama discretization scheme, which is well known in numerical analysis to have slow convergence properties. We adapt stochastic Runge-Kutta (SRK) methods to Bayesian estimation of static parameters for diffusions. This can be implemented for high-dimensional diffusions and is seemingly under-appreciated in the uncertainty quantification and statistics fields. For a class of diffusions and SRK methods, we consider the estimation of the posterior expectation of the parameters. We prove that to achieve an MSE of $\mathcal{O}(\epsilon^2)$, for given $\epsilon>0$, the associated work is $\mathcal{O}(\epsilon^{-2})$. Whilst the latter is achievable for the Milstein scheme, that scheme is often not applicable to diffusions in dimension larger than two. 
We also illustrate our methodology in several numerical examples.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141505344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
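The multilevel Monte Carlo structure referred to above can be sketched for E[X_T] of a geometric Brownian motion, with Euler-Maruyama in place of the paper's stochastic Runge-Kutta schemes; the coupling drives fine and coarse paths with the same Brownian increments. Levels, sample counts, and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# MLMC for E[X_T] of dX = a*X dt + b*X dW (GBM), Euler-Maruyama levels
# with 2^(l+1) fine steps; exact answer E[X_T] = x0 * exp(a*T).
a, b, x0, T = 0.05, 0.2, 1.0, 1.0

def level_estimator(level, n_paths):
    # Mean of P_fine - P_coarse on this level (P_coarse = 0 on level 0).
    nf = 2 ** (level + 1)
    dtf = T / nf
    dW = np.sqrt(dtf) * rng.standard_normal((n_paths, nf))
    xf = np.full(n_paths, x0)
    xc = np.full(n_paths, x0)
    for i in range(nf):
        xf = xf + a * xf * dtf + b * xf * dW[:, i]
    if level > 0:
        dtc = 2 * dtf
        for i in range(0, nf, 2):
            # Coarse path sees the summed fine increments (the coupling).
            xc = xc + a * xc * dtc + b * xc * (dW[:, i] + dW[:, i + 1])
    else:
        xc = np.zeros(n_paths)
    return np.mean(xf - xc)

# Telescoping sum E[P_L] = sum_l E[P_l - P_{l-1}], fewer samples per level.
estimate = sum(level_estimator(l, n)
               for l, n in enumerate([200_000, 50_000, 12_500, 3_125]))
```

The per-level sample counts shrink geometrically because the coupled differences have small variance; that is the source of the MSE-versus-work gains quoted in the abstract.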
{"title":"Sensitivity Analysis of the Information Gain in Infinite-Dimensional Bayesian Linear Inverse Problems","authors":"Abhijit Chowdhary, Shanyin Tong, Georg Stadler, Alen Alexanderian","doi":"10.1615/int.j.uncertaintyquantification.2024051416","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051416","url":null,"abstract":"We study the sensitivity of infinite-dimensional Bayesian linear inverse problems governed by partial differential equations (PDEs) with respect to modeling uncertainties. In particular, we consider derivative-based sensitivity analysis of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior distribution. To facilitate this, we develop a fast and accurate method for computing derivatives of the information gain with respect to auxiliary model parameters. Our approach combines low-rank approximations, adjoint-based eigenvalue sensitivity analysis, and post-optimal sensitivity analysis. The proposed approach also paves the way for global sensitivity analysis by computing derivative-based global sensitivity measures. 
We illustrate different aspects of the proposed approach using an inverse problem governed by a scalar linear elliptic PDE, and an inverse problem governed by the three-dimensional equations of linear elasticity, which is motivated by the inversion of the fault-slip field after an earthquake.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140937158","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
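In a finite-dimensional linear Gaussian analogue of the setting above, the information gain has a closed form, and its sensitivity to an auxiliary parameter (here the noise level) can be checked by finite differences, whereas the paper computes such derivatives with adjoint methods. The forward operator and all numbers below are invented.

```python
import numpy as np

def kl_gauss(m1, C1, m0, C0):
    # D_KL( N(m1, C1) || N(m0, C0) ): the information gain posterior -> prior.
    d = len(m0)
    C0_inv = np.linalg.inv(C0)
    diff = m0 - m1
    return 0.5 * (np.trace(C0_inv @ C1) + diff @ C0_inv @ diff - d
                  + np.log(np.linalg.det(C0) / np.linalg.det(C1)))

def info_gain(noise_std):
    # Linear Gaussian inverse problem y = A m + eps, prior m ~ N(0, I).
    rng = np.random.default_rng(8)          # fixed operator across calls
    A = rng.standard_normal((5, 3))
    m_true = np.array([1.0, -1.0, 0.5])
    y = A @ m_true                           # noiseless data for the demo
    C0 = np.eye(3)
    Cn_inv = np.eye(5) / noise_std**2
    C1 = np.linalg.inv(np.linalg.inv(C0) + A.T @ Cn_inv @ A)
    m1 = C1 @ (A.T @ Cn_inv @ y)
    return kl_gauss(m1, C1, np.zeros(3), C0)

g = info_gain(0.1)
# Finite-difference sensitivity of the information gain to the noise level.
h = 1e-4
dg = (info_gain(0.1 + h) - info_gain(0.1 - h)) / (2 * h)
```

As expected, noisier data carry less information, so the derivative of the gain with respect to the noise level is negative.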
{"title":"A Bayesian Calibration Framework with Embedded Model Error for Model Diagnostics","authors":"Arun Hegde, Elan Weiss, Wolfgang Windl, Habib N. Najm, Cosmin Safta","doi":"10.1615/int.j.uncertaintyquantification.2024051602","DOIUrl":"https://doi.org/10.1615/int.j.uncertaintyquantification.2024051602","url":null,"abstract":"We study the utility and performance of a Bayesian model error embedding construction in the context of molecular dynamics modeling of metallic alloys, where we embed model error terms in existing interatomic potential model parameters. To alleviate the computational burden of this approach, we propose a framework combining likelihood approximation and Gaussian process surrogates. We leverage sparse Gaussian process techniques to construct a hierarchy of increasingly accurate but more expensive surrogate models. This hierarchy is then exploited by multilevel Markov chain Monte Carlo methods to efficiently sample from the target posterior distribution. We illustrate the utility of this approach by calibrating an interatomic potential model for a family of gold-copper alloys. In particular, this case study highlights effective means of dealing with the computational challenges of Bayesian model error embedding in large-scale physical models, and demonstrates the utility of embedded model error for model diagnostics.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":1.7,"publicationDate":"2024-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141153116","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
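The surrogate-hierarchy MCMC described above can be caricatured by two-stage (delayed-acceptance) Metropolis, in which a cheap surrogate posterior screens proposals before the expensive posterior is evaluated; the second stage corrects for the surrogate's error, so the exact target remains the stationary distribution. The one-dimensional targets below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

def log_post_exact(x):
    return -0.5 * x**2               # "expensive" target: standard normal

def log_post_surr(x):
    return -0.5 * (x / 1.1)**2       # slightly-off cheap surrogate

x, chain = 0.0, []
lp_exact, lp_surr = log_post_exact(x), log_post_surr(x)
for _ in range(50_000):
    xp = x + 0.8 * rng.standard_normal()
    lps = log_post_surr(xp)
    # Stage 1: accept/reject using the surrogate only (cheap screen).
    if np.log(rng.uniform()) < lps - lp_surr:
        # Stage 2: correct with the exact/surrogate ratio; only proposals
        # surviving stage 1 pay for an exact evaluation.
        lpe = log_post_exact(xp)
        if np.log(rng.uniform()) < (lpe - lp_exact) - (lps - lp_surr):
            x, lp_exact, lp_surr = xp, lpe, lps
    chain.append(x)

chain = np.array(chain)
```

The closer the surrogate tracks the exact posterior, the fewer exact evaluations are wasted on proposals that stage 2 would reject, which is the payoff of building an accurate surrogate hierarchy.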