{"title":"大规模稀疏线性反问题的近似贝叶斯方法","authors":"Y. Altmann","doi":"10.23919/eusipco55093.2022.9909536","DOIUrl":null,"url":null,"abstract":"In this paper, we investigate and compare approximate Bayesian methods for high-dimensional linear inverse problems where sparsity-promoting prior distributions can be used to regularized the inference process. In particular, we investigate fully factorized priors which lead to multimodal and potentially non-smooth posterior distributions such as Bernoulli-Gaussian priors. In addition to the most traditional variational Bayes framework based on mean-field approximation, we compare different implementations of power expectation-propagation (EP) in terms of estimation of the posterior means and marginal variances, using fully factorized approximations. The different methods are compared using low-dimensional examples and we then discuss the potential benefits of power EP for image restoration. These preliminary results tend to confirm that in the case of Gaussian likelihoods, EP generally provides more reliable marginal variances while power EP offers more flexibility for generalised linear inverse problems.","PeriodicalId":231263,"journal":{"name":"2022 30th European Signal Processing Conference (EUSIPCO)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"On approximate Bayesian methods for large-scale sparse linear inverse problems\",\"authors\":\"Y. Altmann\",\"doi\":\"10.23919/eusipco55093.2022.9909536\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we investigate and compare approximate Bayesian methods for high-dimensional linear inverse problems where sparsity-promoting prior distributions can be used to regularized the inference process. In particular, we investigate fully factorized priors which lead to multimodal and potentially non-smooth posterior distributions such as Bernoulli-Gaussian priors. In addition to the most traditional variational Bayes framework based on mean-field approximation, we compare different implementations of power expectation-propagation (EP) in terms of estimation of the posterior means and marginal variances, using fully factorized approximations. The different methods are compared using low-dimensional examples and we then discuss the potential benefits of power EP for image restoration. 
These preliminary results tend to confirm that in the case of Gaussian likelihoods, EP generally provides more reliable marginal variances while power EP offers more flexibility for generalised linear inverse problems.\",\"PeriodicalId\":231263,\"journal\":{\"name\":\"2022 30th European Signal Processing Conference (EUSIPCO)\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-08-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 30th European Signal Processing Conference (EUSIPCO)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.23919/eusipco55093.2022.9909536\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 30th European Signal Processing Conference (EUSIPCO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/eusipco55093.2022.9909536","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
On approximate Bayesian methods for large-scale sparse linear inverse problems
Abstract: In this paper, we investigate and compare approximate Bayesian methods for high-dimensional linear inverse problems in which sparsity-promoting prior distributions can be used to regularize the inference process. In particular, we investigate fully factorized priors, such as Bernoulli-Gaussian priors, which lead to multimodal and potentially non-smooth posterior distributions. In addition to the traditional variational Bayes framework based on the mean-field approximation, we compare different implementations of power expectation-propagation (EP) in terms of the estimation of posterior means and marginal variances, using fully factorized approximations. The different methods are compared on low-dimensional examples, and we then discuss the potential benefits of power EP for image restoration. These preliminary results tend to confirm that, in the case of Gaussian likelihoods, EP generally provides more reliable marginal variances, while power EP offers more flexibility for generalised linear inverse problems.
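To make the model class concrete, the sketch below sets up a small linear inverse problem y = Ax + n with a Gaussian likelihood and independent Bernoulli-Gaussian priors on the coefficients, and runs a damped expectation-propagation loop in which each non-Gaussian prior factor is approximated by a univariate Gaussian site while the Gaussian likelihood term is kept exactly. This is a generic, minimal EP sketch under stated assumptions, not the paper's implementation: the fully factorized and power-EP variants compared in the paper use fractional site updates, and the helper name `spike_slab_moments`, the prior settings, the damping factor and the problem sizes below are illustrative choices, not values from the paper.

```python
# Minimal sketch (assumed setup, not the paper's code) of damped EP for a small
# sparse linear inverse problem y = A x + n with a Gaussian likelihood and
# independent Bernoulli-Gaussian priors x_i ~ (1 - p) delta_0 + p N(0, v).
import numpy as np

rng = np.random.default_rng(0)

# --- Problem setup (sizes and hyperparameters are illustrative assumptions)
n_obs, n_dim = 30, 15
p_slab, v_slab, s2 = 0.2, 1.0, 0.05
A = rng.standard_normal((n_obs, n_dim)) / np.sqrt(n_obs)
x_true = rng.standard_normal(n_dim) * (rng.random(n_dim) < p_slab) * np.sqrt(v_slab)
y = A @ x_true + np.sqrt(s2) * rng.standard_normal(n_obs)

def spike_slab_moments(mu_c, var_c):
    """Mean and variance of the tilted distribution: a Gaussian cavity N(mu_c, var_c)
    multiplied by the Bernoulli-Gaussian factor (1 - p) delta_0 + p N(0, v)."""
    # Log-evidence of the slab (Gaussian) and spike (point mass at 0) components.
    log_slab = np.log(p_slab) - 0.5 * (np.log(2 * np.pi * (var_c + v_slab))
                                       + mu_c**2 / (var_c + v_slab))
    log_spike = np.log(1 - p_slab) - 0.5 * (np.log(2 * np.pi * var_c)
                                            + mu_c**2 / var_c)
    w = 1.0 / (1.0 + np.exp(log_spike - log_slab))   # posterior slab probability
    v_post = 1.0 / (1.0 / var_c + 1.0 / v_slab)      # slab-component variance
    m_post = v_post * mu_c / var_c                   # slab-component mean
    mean = w * m_post
    var = w * (v_post + m_post**2) - mean**2
    return mean, var

# --- Damped EP: one univariate Gaussian site per prior factor (natural parameters),
#     the Gaussian likelihood term is handled exactly.
lam = np.full(n_dim, 1e-3)     # site precisions
eta = np.zeros(n_dim)          # site precision-times-mean
lik_prec = A.T @ A / s2        # exact Gaussian likelihood contribution
lik_shift = A.T @ y / s2
alpha = 0.5                    # damping factor (assumed, for stability)

for sweep in range(50):
    Sigma = np.linalg.inv(lik_prec + np.diag(lam))
    m, v = Sigma @ (lik_shift + eta), np.diag(Sigma)
    for i in range(n_dim):
        prec_cav = 1.0 / v[i] - lam[i]               # cavity: remove site i
        if prec_cav <= 0:                            # skip numerically unstable updates
            continue
        shift_cav = m[i] / v[i] - eta[i]
        mt, vt = spike_slab_moments(shift_cav / prec_cav, 1.0 / prec_cav)
        vt = max(vt, 1e-12)                          # numerical floor
        lam_new, eta_new = 1.0 / vt - prec_cav, mt / vt - shift_cav
        lam[i] = (1 - alpha) * lam[i] + alpha * lam_new   # damped site update
        eta[i] = (1 - alpha) * eta[i] + alpha * eta_new

Sigma = np.linalg.inv(lik_prec + np.diag(lam))
post_mean, post_var = Sigma @ (lik_shift + eta), np.diag(Sigma)
print("posterior means:    ", np.round(post_mean, 3))
print("marginal variances: ", np.round(post_var, 3))
```

The `spike_slab_moments` helper is the moment-matching step at the heart of an EP site update: it computes the mean and variance of the cavity distribution multiplied by one Bernoulli-Gaussian factor. Power EP would instead raise each site to a fractional power before moment matching, and a mean-field variational Bayes scheme would replace this loop with coordinate-wise expectations of the log joint; the sketch above is only meant to illustrate the model and the vanilla EP updates the paper uses as a baseline.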