Sequential infinite-dimensional Bayesian optimal experimental design with derivative-informed latent attention neural operator

Jinwoo Go, Peng Chen
{"title":"使用导数信息潜注意神经算子的序列无限维贝叶斯优化实验设计","authors":"Jinwoo Go, Peng Chen","doi":"arxiv-2409.09141","DOIUrl":null,"url":null,"abstract":"In this work, we develop a new computational framework to solve sequential\nBayesian experimental design (SBOED) problems constrained by large-scale\npartial differential equations with infinite-dimensional random parameters. We\npropose an adaptive terminal formulation of the optimality criteria for SBOED\nto achieve adaptive global optimality. We also establish an equivalent\noptimization formulation to achieve computational simplicity enabled by Laplace\nand low-rank approximations of the posterior. To accelerate the solution of the\nSBOED problem, we develop a derivative-informed latent attention neural\noperator (LANO), a new neural network surrogate model that leverages (1)\nderivative-informed dimension reduction for latent encoding, (2) an attention\nmechanism to capture the dynamics in the latent space, (3) an efficient\ntraining in the latent space augmented by projected Jacobian, which\ncollectively lead to an efficient, accurate, and scalable surrogate in\ncomputing not only the parameter-to-observable (PtO) maps but also their\nJacobians. We further develop the formulation for the computation of the MAP\npoints, the eigenpairs, and the sampling from posterior by LANO in the reduced\nspaces and use these computations to solve the SBOED problem. We demonstrate\nthe superior accuracy of LANO compared to two other neural architectures and\nthe high accuracy of LANO compared to the finite element method (FEM) for the\ncomputation of MAP points in solving the SBOED problem with application to the\nexperimental design of the time to take MRI images in monitoring tumor growth.","PeriodicalId":501309,"journal":{"name":"arXiv - CS - Computational Engineering, Finance, and Science","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Sequential infinite-dimensional Bayesian optimal experimental design with derivative-informed latent attention neural operator\",\"authors\":\"Jinwoo Go, Peng Chen\",\"doi\":\"arxiv-2409.09141\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this work, we develop a new computational framework to solve sequential\\nBayesian experimental design (SBOED) problems constrained by large-scale\\npartial differential equations with infinite-dimensional random parameters. We\\npropose an adaptive terminal formulation of the optimality criteria for SBOED\\nto achieve adaptive global optimality. We also establish an equivalent\\noptimization formulation to achieve computational simplicity enabled by Laplace\\nand low-rank approximations of the posterior. To accelerate the solution of the\\nSBOED problem, we develop a derivative-informed latent attention neural\\noperator (LANO), a new neural network surrogate model that leverages (1)\\nderivative-informed dimension reduction for latent encoding, (2) an attention\\nmechanism to capture the dynamics in the latent space, (3) an efficient\\ntraining in the latent space augmented by projected Jacobian, which\\ncollectively lead to an efficient, accurate, and scalable surrogate in\\ncomputing not only the parameter-to-observable (PtO) maps but also their\\nJacobians. 
We further develop the formulation for the computation of the MAP\\npoints, the eigenpairs, and the sampling from posterior by LANO in the reduced\\nspaces and use these computations to solve the SBOED problem. We demonstrate\\nthe superior accuracy of LANO compared to two other neural architectures and\\nthe high accuracy of LANO compared to the finite element method (FEM) for the\\ncomputation of MAP points in solving the SBOED problem with application to the\\nexperimental design of the time to take MRI images in monitoring tumor growth.\",\"PeriodicalId\":501309,\"journal\":{\"name\":\"arXiv - CS - Computational Engineering, Finance, and Science\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Computational Engineering, Finance, and Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.09141\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Computational Engineering, Finance, and Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09141","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

In this work, we develop a new computational framework to solve sequential Bayesian optimal experimental design (SBOED) problems constrained by large-scale partial differential equations with infinite-dimensional random parameters. We propose an adaptive terminal formulation of the optimality criteria for SBOED to achieve adaptive global optimality. We also establish an equivalent optimization formulation that achieves computational simplicity, enabled by Laplace and low-rank approximations of the posterior. To accelerate the solution of the SBOED problem, we develop a derivative-informed latent attention neural operator (LANO), a new neural network surrogate model that leverages (1) derivative-informed dimension reduction for latent encoding, (2) an attention mechanism to capture the dynamics in the latent space, and (3) efficient training in the latent space augmented by projected Jacobians, which collectively lead to an efficient, accurate, and scalable surrogate for computing not only the parameter-to-observable (PtO) maps but also their Jacobians. We further develop formulations for computing the MAP points and the eigenpairs, and for sampling from the posterior with LANO in the reduced spaces, and we use these computations to solve the SBOED problem. We demonstrate the superior accuracy of LANO compared to two other neural architectures, and its high accuracy relative to the finite element method (FEM) for the computation of MAP points, in solving the SBOED problem with application to the experimental design of the times to take MRI images in monitoring tumor growth.
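
The Laplace and low-rank posterior approximations mentioned in the abstract are standard tools in infinite-dimensional Bayesian inverse problems. Below is a minimal sketch in our own notation, assuming a linearized PtO map with Jacobian J, Gaussian noise covariance Γ_noise, and Gaussian prior covariance C_prior; the paper's exact formulation may differ.

```latex
% Minimal sketch (notation ours): Laplace approximation of the posterior at the MAP point.
\pi_{\mathrm{post}}(m \mid y)
  \;\approx\; \mathcal{N}\!\left(m_{\mathrm{MAP}},\, \mathcal{H}(m_{\mathrm{MAP}})^{-1}\right),
\qquad
\mathcal{H} \;=\; J^{*}\Gamma_{\mathrm{noise}}^{-1}J \;+\; \mathcal{C}_{\mathrm{prior}}^{-1}.

% Low-rank approximation from the dominant eigenpairs (\lambda_i, v_i) of the
% prior-preconditioned data-misfit Hessian:
\mathcal{C}_{\mathrm{prior}}^{1/2} J^{*}\Gamma_{\mathrm{noise}}^{-1}J\,
  \mathcal{C}_{\mathrm{prior}}^{1/2}\, v_i \;=\; \lambda_i v_i,
\qquad
\mathcal{H}^{-1} \;\approx\; \mathcal{C}_{\mathrm{prior}}
  \;-\; \sum_{i=1}^{r} \frac{\lambda_i}{1+\lambda_i}
  \big(\mathcal{C}_{\mathrm{prior}}^{1/2} v_i\big)\big(\mathcal{C}_{\mathrm{prior}}^{1/2} v_i\big)^{*}.

% In this linearized (Gaussian) setting, an expected-information-gain criterion for a design \xi
% reduces to the retained eigenvalues:
\Psi(\xi) \;\approx\; \tfrac{1}{2}\sum_{i=1}^{r}\log\!\left(1+\lambda_i(\xi)\right).
```

In this setting the design criterion depends only on a handful of dominant eigenvalues rather than the full infinite-dimensional covariance, which is the computational simplification that the low-rank approximation provides.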
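
As a hedged illustration of item (1), derivative-informed dimension reduction for latent encoding: one common construction (simplified here, without prior preconditioning; the paper's construction may differ) builds the reduced input basis from the dominant eigenvectors of the expected Jacobian Gram matrix, estimated by Monte Carlo over prior samples. The function name derivative_informed_basis and all shapes below are illustrative assumptions, not the authors' code.

```python
# Hedged, minimal sketch of derivative-informed input dimension reduction
# (not the authors' implementation; function name and shapes are hypothetical).
# The reduced basis spans the dominant eigenvectors of a Monte Carlo estimate
# of the expected Jacobian Gram matrix E[J^T J] over prior samples.
import numpy as np

def derivative_informed_basis(jacobian_samples, rank):
    """jacobian_samples: iterable of (n_obs, n_param) Jacobians of the PtO map
    evaluated at prior samples.  Returns an (n_param, rank) orthonormal basis."""
    jacobian_samples = list(jacobian_samples)
    n_param = jacobian_samples[0].shape[1]
    gram = np.zeros((n_param, n_param))
    for J in jacobian_samples:            # Monte Carlo estimate of E[J^T J]
        gram += J.T @ J
    gram /= len(jacobian_samples)
    eigvals, eigvecs = np.linalg.eigh(gram)   # eigenvalues in ascending order
    return eigvecs[:, np.argsort(eigvals)[::-1][:rank]]

# Usage (illustrative): encode a parameter sample m into latent coordinates
# z = Phi.T @ m, which an attention-based latent model then propagates in time.
# Phi = derivative_informed_basis(jacs, rank=64)
```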