A Bayesian neural network approach to Multi-fidelity surrogate modelling

Baptiste Kerleguer (DAM/DIF, CMAP), Claire Cannamela (DAM/DIF), Josselin Garnier (CMAP)
{"title":"A Bayesian neural network approach to Multi-fidelity surrogate modelling","authors":"Baptiste KerleguerDAM/DIF, CMAP, Claire CannamelaDAM/DIF, Josselin GarnierCMAP","doi":"arxiv-2312.02575","DOIUrl":null,"url":null,"abstract":"This paper deals with surrogate modelling of a computer code output in a\nhierarchical multi-fidelity context, i.e., when the output can be evaluated at\ndifferent levels of accuracy and computational cost. Using observations of the\noutput at low- and high-fidelity levels, we propose a method that combines\nGaussian process (GP) regression and Bayesian neural network (BNN), in a method\ncalled GPBNN. The low-fidelity output is treated as a single-fidelity code\nusing classical GP regression. The high-fidelity output is approximated by a\nBNN that incorporates, in addition to the high-fidelity observations,\nwell-chosen realisations of the low-fidelity output emulator. The predictive\nuncertainty of the final surrogate model is then quantified by a complete\ncharacterisation of the uncertainties of the different models and their\ninteraction. GPBNN is compared with most of the multi-fidelity regression\nmethods allowing to quantify the prediction uncertainty.","PeriodicalId":501330,"journal":{"name":"arXiv - MATH - Statistics Theory","volume":"93 4","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2312.02575","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This paper deals with surrogate modelling of a computer code output in a hierarchical multi-fidelity context, i.e., when the output can be evaluated at different levels of accuracy and computational cost. Using observations of the output at low- and high-fidelity levels, we propose a method, called GPBNN, that combines Gaussian process (GP) regression and a Bayesian neural network (BNN). The low-fidelity output is treated as a single-fidelity code using classical GP regression. The high-fidelity output is approximated by a BNN that incorporates, in addition to the high-fidelity observations, well-chosen realisations of the low-fidelity output emulator. The predictive uncertainty of the final surrogate model is then quantified by a complete characterisation of the uncertainties of the different models and their interaction. GPBNN is compared with most of the multi-fidelity regression methods that allow the prediction uncertainty to be quantified.
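
To make the two-stage construction concrete, here is a minimal sketch of the idea under explicit assumptions: the toy low- and high-fidelity functions, the sample sizes, and the network architecture are invented for illustration, and the Bayesian neural network stage is replaced by a small ensemble of scikit-learn MLPs as a crude stand-in, not the inference scheme of the paper. The low-fidelity output is emulated by a GP; each high-fidelity regressor is trained on inputs augmented with a posterior realisation of that GP, so that low-fidelity uncertainty propagates into the high-fidelity prediction.

```python
# Sketch of a GP + (approximate) BNN multi-fidelity surrogate, in the spirit of
# the GPBNN construction described in the abstract. All functions, sizes and the
# MLP-ensemble stand-in for the BNN are assumptions made for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy low- and high-fidelity codes (illustrative only).
f_lo = lambda x: np.sin(8 * np.pi * x)
f_hi = lambda x: (x - np.sqrt(2)) * f_lo(x) ** 2

# Observations: many cheap low-fidelity runs, few expensive high-fidelity runs.
x_lo = rng.uniform(0, 1, 50).reshape(-1, 1)
x_hi = np.linspace(0, 1, 8).reshape(-1, 1)
y_lo, y_hi = f_lo(x_lo).ravel(), f_hi(x_hi).ravel()

# Step 1: classical GP regression on the low-fidelity output.
gp_lo = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp_lo.fit(x_lo, y_lo)

# Step 2: high-fidelity stage. A true BNN is replaced here by an ensemble of
# MLPs; each member sees (x, one posterior realisation of the low-fidelity
# emulator at x) as input, so low-fidelity uncertainty enters the training data.
n_members = 20
ensemble = []
for seed in range(n_members):
    z_hi = gp_lo.sample_y(x_hi, random_state=seed).ravel()  # LF realisation at HF points
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=seed)
    net.fit(np.column_stack([x_hi.ravel(), z_hi]), y_hi)
    ensemble.append(net)

# Prediction: pair each member with a fresh LF realisation and aggregate, giving
# a predictive mean and a spread that mixes LF- and HF-model uncertainty.
x_test = np.linspace(0, 1, 200).reshape(-1, 1)
preds = []
for seed, net in enumerate(ensemble):
    z_test = gp_lo.sample_y(x_test, random_state=1000 + seed).ravel()
    preds.append(net.predict(np.column_stack([x_test.ravel(), z_test])))
preds = np.array(preds)
mean, std = preds.mean(axis=0), preds.std(axis=0)
```

The ensemble-over-realisations loop is only meant to show where the two sources of uncertainty meet; the paper itself characterises them through the BNN posterior and its interaction with the GP emulator.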