{"title":"A Bayesian neural network approach to Multi-fidelity surrogate modelling","authors":"Baptiste KerleguerDAM/DIF, CMAP, Claire CannamelaDAM/DIF, Josselin GarnierCMAP","doi":"arxiv-2312.02575","DOIUrl":null,"url":null,"abstract":"This paper deals with surrogate modelling of a computer code output in a\nhierarchical multi-fidelity context, i.e., when the output can be evaluated at\ndifferent levels of accuracy and computational cost. Using observations of the\noutput at low- and high-fidelity levels, we propose a method that combines\nGaussian process (GP) regression and Bayesian neural network (BNN), in a method\ncalled GPBNN. The low-fidelity output is treated as a single-fidelity code\nusing classical GP regression. The high-fidelity output is approximated by a\nBNN that incorporates, in addition to the high-fidelity observations,\nwell-chosen realisations of the low-fidelity output emulator. The predictive\nuncertainty of the final surrogate model is then quantified by a complete\ncharacterisation of the uncertainties of the different models and their\ninteraction. GPBNN is compared with most of the multi-fidelity regression\nmethods allowing to quantify the prediction uncertainty.","PeriodicalId":501330,"journal":{"name":"arXiv - MATH - Statistics Theory","volume":"93 4","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2312.02575","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
This paper deals with surrogate modelling of a computer code output in a hierarchical multi-fidelity context, i.e., when the output can be evaluated at different levels of accuracy and computational cost. Using observations of the output at the low- and high-fidelity levels, we propose a method, called GPBNN, that combines Gaussian process (GP) regression and a Bayesian neural network (BNN). The low-fidelity output is treated as a single-fidelity code and emulated by classical GP regression. The high-fidelity output is approximated by a BNN that incorporates, in addition to the high-fidelity observations, well-chosen realisations of the low-fidelity output emulator. The predictive uncertainty of the final surrogate model is then quantified by a complete characterisation of the uncertainties of the different models and of their interaction. GPBNN is compared with most of the multi-fidelity regression methods that can quantify prediction uncertainty.
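To make the two-stage structure concrete, here is a minimal Python sketch of the idea described above, not the authors' implementation. Assumptions of my own: toy 1-D test functions f_lo and f_hi, scikit-learn's GaussianProcessRegressor for the low-fidelity emulator, and a small ensemble of MLPs standing in for the Bayesian neural network (ensemble spread as a crude proxy for BNN posterior uncertainty); all hyperparameters are arbitrary.

    # Sketch of a GP + BNN-style multi-fidelity surrogate.
    # The BNN stage is approximated by a deep ensemble (an assumption,
    # not the paper's method); toy functions and settings are illustrative.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical hierarchical code: cheap low-fidelity, costly high-fidelity.
    f_lo = lambda x: np.sin(8 * x)
    f_hi = lambda x: np.sin(8 * x) + 0.3 * x**2

    X_lo = rng.uniform(0, 1, (40, 1))   # many cheap low-fidelity runs
    X_hi = rng.uniform(0, 1, (8, 1))    # few expensive high-fidelity runs

    # Stage 1: classical GP regression on the low-fidelity output.
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6)
    gp.fit(X_lo, f_lo(X_lo).ravel())

    # Stage 2: one network per posterior realisation of the low-fidelity
    # emulator; each network takes [x, realisation(x)] as input, so the
    # ensemble spread mixes GP uncertainty with network-level uncertainty.
    nets = []
    for seed in range(10):
        z_hi = gp.sample_y(X_hi, random_state=seed).ravel()
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                           random_state=seed)
        net.fit(np.column_stack([X_hi.ravel(), z_hi]), f_hi(X_hi).ravel())
        nets.append(net)

    # Predict: pair each network with a fresh GP realisation at test points,
    # then summarise the ensemble as a predictive mean and spread.
    X_test = np.linspace(0, 1, 200)[:, None]
    preds = np.stack([
        net.predict(np.column_stack(
            [X_test.ravel(), gp.sample_y(X_test, random_state=100 + i).ravel()]))
        for i, net in enumerate(nets)
    ])
    mean, std = preds.mean(axis=0), preds.std(axis=0)
    print(f"max predictive std: {std.max():.3f}")

Feeding sampled realisations of the low-fidelity emulator, rather than only its predictive mean, is what lets the high-fidelity model's uncertainty reflect the low-fidelity emulator's own uncertainty, which is the interaction the abstract refers to.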