Title: An adaptive strategy for sequential designs of multilevel computer experiments
Authors: Ayao Ehara, S. Guillas
Journal: International Journal for Uncertainty Quantification (JCR Q2, Engineering, Multidisciplinary; Impact Factor 1.5)
DOI: 10.1615/int.j.uncertaintyquantification.2023038376
Publication date: 2021-04-05
Publication type: Journal Article
Citations: 7
Abstract
Investigating uncertainties in computer simulations can be prohibitive in terms of computational cost, since the simulator needs to be run over a large number of input values. Building an emulator, i.e., a statistical surrogate model of the simulator constructed from a design of experiments comprising a comparatively small number of evaluations of the forward solver, greatly alleviates the computational burden of carrying out such investigations. Nevertheless, even this can exceed the computational budget of many studies. Two major approaches have been used to reduce the budget needed to build the emulator: efficient designs of experiments, such as sequential designs, and the combination of training data of different degrees of sophistication in a so-called multi-fidelity method, called multilevel when these fidelities are ordered, typically by increasing resolution. We present here a novel method that combines both approaches: the multilevel adaptive sequential design of computer experiments (MLASCE) in the framework of Gaussian process (GP) emulators. We make use of reproducing kernel Hilbert spaces as a tool for our GP approximations of the differences between two consecutive levels. This dual strategy allows us to efficiently allocate limited computational resources over simulations of different levels of fidelity and to build the GP emulator. The allocation of computational resources is shown to be the solution of a simple optimization problem in a special case, for which we theoretically prove the validity of our approach. Our proposed method is compared with other existing models of multi-fidelity Gaussian process emulation. Gains of orders of magnitude in accuracy or computing budget are demonstrated in some of the numerical examples, for some settings.
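The core multilevel construction described in the abstract — a GP emulator of the cheapest simulator plus independent GP emulators of the differences between consecutive fidelity levels, with more of the run budget allocated to the cheaper level — can be sketched in a few lines. The sketch below is a minimal two-level illustration with hypothetical toy simulators (f_coarse, f_fine are stand-ins, not from the paper) and a hand-rolled GP mean; it omits the RKHS-based sequential design criterion and the predictive-variance machinery of the actual MLASCE method:

```python
import numpy as np

# Hypothetical stand-ins for two fidelity levels of a simulator:
# a cheap coarse model and an expensive fine model whose difference is smooth.
def f_coarse(x):
    return np.sin(8.0 * x)

def f_fine(x):
    return np.sin(8.0 * x) + 0.3 * x**2

def rbf(A, B, length_scale=0.2):
    """Squared-exponential kernel matrix between 1-D point sets A and B."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_fit(X, y, jitter=1e-6):
    """GP regression weights alpha = (K + jitter*I)^-1 y for mean prediction."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    return X, np.linalg.solve(K, y)

def gp_mean(model, Xs):
    """Posterior mean of the GP at new points Xs."""
    X, alpha = model
    return rbf(Xs, X) @ alpha

# Multilevel budget allocation: many cheap runs, few expensive runs.
X0 = np.linspace(0.0, 1.0, 20)   # level-0 design (coarse model, 20 runs)
X1 = np.linspace(0.0, 1.0, 6)    # level-1 design (fine model, 6 runs)

gp0 = gp_fit(X0, f_coarse(X0))                # GP of the coarse level
gp1 = gp_fit(X1, f_fine(X1) - f_coarse(X1))   # GP of the level difference

def emulate(x):
    """Multilevel emulator: coarse-level GP plus GP of the correction term."""
    return gp_mean(gp0, x) + gp_mean(gp1, x)

Xtest = np.linspace(0.0, 1.0, 50)
max_err = np.max(np.abs(emulate(Xtest) - f_fine(Xtest)))
```

The design choice mirrors the abstract's rationale: the difference between consecutive levels is typically smoother and smaller in magnitude than the fine model itself, so it can be emulated accurately from far fewer expensive runs than a direct GP on the fine level would require.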
Journal description:
The International Journal for Uncertainty Quantification disseminates information of permanent interest in the areas of analysis, modeling, design and control of complex systems in the presence of uncertainty. The journal seeks to emphasize methods that cross stochastic analysis, statistical modeling and scientific computing. Systems of interest are governed by differential equations, possibly with multiscale features. Topics of particular interest include representation of uncertainty, propagation of uncertainty across scales, resolving the curse of dimensionality, long-time integration for stochastic PDEs, data-driven approaches for constructing stochastic models, validation, verification and uncertainty quantification for predictive computational science, and visualization of uncertainty in high-dimensional spaces. Bayesian computation and machine learning techniques are also of interest, for example in the context of stochastic multiscale systems, for model selection/classification, and for decision making. Reports addressing the dynamic coupling of modern experiments and modeling approaches towards predictive science are particularly encouraged. Applications of uncertainty quantification in all areas of physical and biological sciences are appropriate.