Fused $L_{1/2}$ prior for large scale linear inverse problem with Gibbs bouncy particle sampler

Xiongwen Ke, Yanan Fan, Qingping Zhou
{"title":"用吉布斯弹跳粒子采样器解决大规模线性逆问题的融合 $L_{1/2}$ 先验","authors":"Xiongwen Ke, Yanan Fan, Qingping Zhou","doi":"arxiv-2409.07874","DOIUrl":null,"url":null,"abstract":"In this paper, we study Bayesian approach for solving large scale linear\ninverse problems arising in various scientific and engineering fields. We\npropose a fused $L_{1/2}$ prior with edge-preserving and sparsity-promoting\nproperties and show that it can be formulated as a Gaussian mixture Markov\nrandom field. Since the density function of this family of prior is neither\nlog-concave nor Lipschitz, gradient-based Markov chain Monte Carlo methods can\nnot be applied to sample the posterior. Thus, we present a Gibbs sampler in\nwhich all the conditional posteriors involved have closed form expressions. The\nGibbs sampler works well for small size problems but it is computationally\nintractable for large scale problems due to the need for sample high\ndimensional Gaussian distribution. To reduce the computation burden, we\nconstruct a Gibbs bouncy particle sampler (Gibbs-BPS) based on a piecewise\ndeterministic Markov process. This new sampler combines elements of Gibbs\nsampler with bouncy particle sampler and its computation complexity is an order\nof magnitude smaller. We show that the new sampler converges to the target\ndistribution. With computed tomography examples, we demonstrate that the\nproposed method shows competitive performance with existing popular Bayesian\nmethods and is highly efficient in large scale problems.","PeriodicalId":501425,"journal":{"name":"arXiv - STAT - Methodology","volume":"23 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fused $L_{1/2}$ prior for large scale linear inverse problem with Gibbs bouncy particle sampler\",\"authors\":\"Xiongwen Ke, Yanan Fan, Qingping Zhou\",\"doi\":\"arxiv-2409.07874\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we study Bayesian approach for solving large scale linear\\ninverse problems arising in various scientific and engineering fields. We\\npropose a fused $L_{1/2}$ prior with edge-preserving and sparsity-promoting\\nproperties and show that it can be formulated as a Gaussian mixture Markov\\nrandom field. Since the density function of this family of prior is neither\\nlog-concave nor Lipschitz, gradient-based Markov chain Monte Carlo methods can\\nnot be applied to sample the posterior. Thus, we present a Gibbs sampler in\\nwhich all the conditional posteriors involved have closed form expressions. The\\nGibbs sampler works well for small size problems but it is computationally\\nintractable for large scale problems due to the need for sample high\\ndimensional Gaussian distribution. To reduce the computation burden, we\\nconstruct a Gibbs bouncy particle sampler (Gibbs-BPS) based on a piecewise\\ndeterministic Markov process. This new sampler combines elements of Gibbs\\nsampler with bouncy particle sampler and its computation complexity is an order\\nof magnitude smaller. We show that the new sampler converges to the target\\ndistribution. 
With computed tomography examples, we demonstrate that the\\nproposed method shows competitive performance with existing popular Bayesian\\nmethods and is highly efficient in large scale problems.\",\"PeriodicalId\":501425,\"journal\":{\"name\":\"arXiv - STAT - Methodology\",\"volume\":\"23 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - STAT - Methodology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.07874\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Methodology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07874","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this paper, we study a Bayesian approach for solving large-scale linear inverse problems arising in various scientific and engineering fields. We propose a fused $L_{1/2}$ prior with edge-preserving and sparsity-promoting properties and show that it can be formulated as a Gaussian mixture Markov random field. Since the density function of this family of priors is neither log-concave nor Lipschitz, gradient-based Markov chain Monte Carlo methods cannot be applied to sample the posterior. We therefore present a Gibbs sampler in which all the conditional posteriors involved have closed-form expressions. The Gibbs sampler works well for small problems but becomes computationally intractable for large-scale problems because it requires sampling from a high-dimensional Gaussian distribution. To reduce the computational burden, we construct a Gibbs bouncy particle sampler (Gibbs-BPS) based on a piecewise deterministic Markov process. This new sampler combines elements of the Gibbs sampler with the bouncy particle sampler, and its computational complexity is an order of magnitude smaller. We show that the new sampler converges to the target distribution. With computed tomography examples, we demonstrate that the proposed method performs competitively with existing popular Bayesian methods and is highly efficient for large-scale problems.
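The abstract gives no algorithmic detail, but the bouncy particle sampler it builds on is a standard piecewise deterministic Markov process: the particle moves along straight lines and reflects its velocity off the gradient of the negative log-density at random event times. For a Gaussian target with precision matrix $A$, the bounce rate along a flight is $\max(0, a + bt)$ with $a = v^\top A(x-\mu)$ and $b = v^\top A v$, so the next event time can be drawn exactly by inverting the integrated rate against an Exp(1) draw. The sketch below is a minimal generic BPS for a zero-mean Gaussian target, not the paper's Gibbs-BPS; the function name `bps_gaussian`, the refresh rate, and the time horizon are illustrative assumptions.

```python
# Minimal sketch (assumed, generic BPS; not the paper's Gibbs-BPS) of a
# bouncy particle sampler targeting pi(x) ~ N(0, A^{-1}) with precision A.
import numpy as np

def bps_gaussian(A, T=200.0, refresh_rate=1.0, rng=None):
    """Run a bouncy particle sampler for N(0, A^{-1}) up to time T.

    Returns particle positions at event times; exact expectations would
    average along the piecewise-linear trajectory or sample it on a grid.
    """
    rng = np.random.default_rng(rng)
    d = A.shape[0]
    x = rng.standard_normal(d)          # position
    v = rng.standard_normal(d)          # velocity
    t, samples = 0.0, [x.copy()]

    while t < T:
        grad = A @ x                    # gradient of U(x) = 0.5 x^T A x
        a, b = v @ grad, v @ A @ v      # bounce rate(t) = max(0, a + b t), b > 0
        E = rng.exponential()
        if a >= 0:                      # solve a*tau + b*tau^2/2 = E
            tau_bounce = (-a + np.sqrt(a**2 + 2.0 * b * E)) / b
        else:                           # rate is zero until s = -a/b
            tau_bounce = -a / b + np.sqrt(2.0 * E / b)
        tau_refresh = rng.exponential(1.0 / refresh_rate)

        tau = min(tau_bounce, tau_refresh)
        x = x + tau * v                 # deterministic linear flight
        t += tau
        if tau_bounce <= tau_refresh:   # bounce: reflect v off grad U(x)
            grad = A @ x
            v = v - 2.0 * (v @ grad) / (grad @ grad) * grad
        else:                           # refresh: resample velocity
            v = rng.standard_normal(d)
        samples.append(x.copy())

    return np.array(samples)

# Crude check on a 2-d correlated Gaussian: event-time positions should
# roughly match mean 0 and covariance A^{-1} = [[1, 0.8], [0.8, 1]].
A = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
draws = bps_gaussian(A, T=500.0, rng=0)
print(draws.mean(axis=0), np.cov(draws.T))
```

The closed-form event times are what make the Gaussian case cheap; presumably this is the kind of building block a Gibbs-type scheme can exploit, but the actual Gibbs-BPS construction is described only in the paper itself.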