Continuous Gaussian mixture solution for linear Bayesian inversion with application to Laplace priors
Rafael Flock, Yiqiu Dong, Felipe Uribe, Olivier Zahm
arXiv - STAT - Computation, 2024-08-29. DOI: https://doi.org/arxiv-2408.16594
Abstract
We focus on Bayesian inverse problems with Gaussian likelihood, linear forward model, and priors that can be formulated as a Gaussian mixture. Such a mixture is expressed as an integral of Gaussian density functions weighted by a mixing density over the mixing variables. Within this framework, the corresponding posterior distribution also takes the form of a Gaussian mixture, and we derive the closed-form expression for its posterior mixing density. To sample from the posterior Gaussian mixture, we propose a two-step sampling method: first, we sample the mixing variables from the posterior mixing density, and then we sample the variables of interest from Gaussian densities conditioned on the sampled mixing variables. However, the posterior mixing density is relatively difficult to sample from, especially in high dimensions. Therefore, we propose to replace the posterior mixing density by a dimension-reduced approximation, and we provide a bound on the Hellinger distance for the resulting approximate posterior. We apply the proposed approach to a posterior with a Laplace prior, where we introduce two dimension-reduced approximations for the posterior mixing density. Our numerical experiments indicate that samples generated via the proposed approximations have very low correlation and are close to the exact posterior.
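
To make the two-step sampling idea concrete, the following is a minimal sketch for a toy problem with a Laplace prior written as a Gaussian scale mixture (x_i | v_i ~ N(0, v_i) with exponential mixing variables v_i). The problem dimensions, the scale b, the noise level, and the random-walk Metropolis sampler on the mixing variables are all illustrative assumptions; the paper's dimension-reduced approximation of the posterior mixing density is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem y = A x + noise (hypothetical dimensions).
n, m = 8, 5
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
x_true[0] = 2.0
sigma = 0.1                      # noise std of the Gaussian likelihood
y = A @ x_true + sigma * rng.standard_normal(n)
b = 1.0                          # Laplace prior scale (assumed)

def log_post_mixing(v):
    """Unnormalized log of the posterior mixing density p(v | y).

    Gaussian-scale-mixture form of the Laplace prior:
    x | v ~ N(0, diag(v)), v_i ~ Exp(rate = 1/(2 b^2)),
    hence p(y | v) = N(y; 0, A diag(v) A^T + sigma^2 I).
    """
    if np.any(v <= 0):
        return -np.inf
    C = A @ np.diag(v) @ A.T + sigma**2 * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    log_lik = -0.5 * (logdet + quad)
    log_prior_v = -np.sum(v) / (2 * b**2)   # exponential mixing density
    return log_lik + log_prior_v

def sample_x_given_v(v):
    """Step 2: draw x from the Gaussian conditional p(x | v, y)."""
    prec = A.T @ A / sigma**2 + np.diag(1.0 / v)
    cov = np.linalg.inv(prec)
    mean = cov @ (A.T @ y) / sigma**2
    return rng.multivariate_normal(mean, cov)

# Step 1: sample the mixing variables v with a generic random-walk
# Metropolis on log(v) (a stand-in for the dimension-reduced sampler).
n_samples, step = 2000, 0.3
v = np.ones(m)
lp = log_post_mixing(v)
xs = []
for _ in range(n_samples):
    v_prop = v * np.exp(step * rng.standard_normal(m))
    lp_prop = log_post_mixing(v_prop)
    # log-normal proposal: include the Jacobian correction in the MH ratio
    if np.log(rng.uniform()) < lp_prop - lp + np.sum(np.log(v_prop) - np.log(v)):
        v, lp = v_prop, lp_prop
    xs.append(sample_x_given_v(v))   # Gaussian draw of the variables of interest

xs = np.array(xs)
print("posterior mean estimate:", xs[1000:].mean(axis=0))
```

Each outer iteration produces one draw of the mixing variables followed by one exact Gaussian draw of x conditioned on them, so the correlation between successive x samples is governed entirely by how well the mixing-variable sampler mixes.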