Evolutionary variational inference for Bayesian generalized nonlinear models

Philip Sebastian Hauglie Sommerfelt, Aliaksandr Hubin
Neural Computing and Applications, published 2024-08-21 (Journal Article)
DOI: 10.1007/s00521-024-10349-1
Citations: 0

Abstract



In the exploration of recently developed Bayesian Generalized Nonlinear Models (BGNLM), this paper proposes a pragmatic, scalable approximation for computing posterior distributions. The traditional Markov chain Monte Carlo runs within the populations of the Genetically Modified Mode Jumping Markov Chain Monte Carlo (GMJMCMC) algorithm amount to an NP-hard search problem. To make this search tractable, we suggest replacing those runs with variational Bayes, employing either a mean-field approximation or normalizing flows for simplicity and scalability. The result is an evolutionary variational Bayes algorithm that offers a more scalable alternative to GMJMCMC. Through practical applications, including inference on Bayesian linear models, Bayesian fractional polynomials, and the full BGNLM, we demonstrate the effectiveness of our method: it delivers accurate predictions, transparent and interpretable models, and accessible measures of uncertainty, while improving the scalability of BGNLM inference both by using a novel variational Bayes method and by enabling the use of GPUs for computation.
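The paper's simplest application, a Bayesian linear model, illustrates the kind of posterior that variational Bayes approximates. The sketch below is not the authors' implementation; it shows generic mean-field coordinate-ascent variational inference (CAVI) for a Gaussian linear model with known noise and prior variances, where the factorized-Gaussian means can be checked against the exact posterior mean. All variable names and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = X w_true + noise
n, p = 200, 3
X = rng.normal(size=(n, p))
w_true = np.array([1.5, -2.0, 0.5])
sigma2, tau2 = 0.25, 10.0          # noise and prior variances (assumed known)
y = X @ w_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Mean-field approximation q(w) = prod_j N(mu_j, s2_j), fit by CAVI
mu = np.zeros(p)
prec = np.sum(X**2, axis=0) / sigma2 + 1.0 / tau2   # per-coordinate precision
s2 = 1.0 / prec                                     # q-variances (closed form here)
for _ in range(100):                                # coordinate-ascent sweeps
    for j in range(p):
        r = y - X @ mu + X[:, j] * mu[j]            # residual excluding coordinate j
        mu[j] = (X[:, j] @ r / sigma2) / prec[j]

# Exact Gaussian posterior, for comparison
S = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
m = S @ X.T @ y / sigma2
print(np.round(mu, 3))
print(np.round(m, 3))
```

For a Gaussian target, the CAVI means converge to the exact posterior mean, while the factorized variances understate the true posterior uncertainty — one motivation for the normalizing-flow alternative the paper also considers.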
