Evolutionary variational inference for Bayesian generalized nonlinear models

Philip Sebastian Hauglie Sommerfelt, Aliaksandr Hubin

Neural Computing and Applications, published 2024-08-21. DOI: 10.1007/s00521-024-10349-1
In the exploration of the recently developed Bayesian Generalized Nonlinear Models (BGNLM), this paper proposes a pragmatic, scalable approximation for computing posterior distributions. The model search performed by traditional Markov chain Monte Carlo within the populations of the Genetically Modified Mode Jumping Markov Chain Monte Carlo (GMJMCMC) algorithm is an NP-hard search problem. To reduce this complexity, we suggest instead using variational Bayes, employing either mean-field approximations or normalizing flows for simplicity and scalability. The result is an evolutionary variational Bayes algorithm, a more scalable alternative to GMJMCMC. Through practical applications, including inference on Bayesian linear models, Bayesian fractional polynomials, and full BGNLM, we demonstrate the effectiveness of our method: it delivers accurate predictions, transparent and interpretable models, and accessible measures of uncertainty, while improving the scalability of BGNLM inference both through the novel variational Bayes method itself and by enabling the use of GPUs for computation.
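The core idea described above, scoring candidate sets of nonlinear features with a Bayesian model criterion inside an evolutionary search over populations, can be sketched in miniature. The toy below is an illustration, not the authors' implementation: the feature pool, the single-flip mutation rule, and the use of the closed-form Gaussian log evidence in place of a variational ELBO are all simplifying assumptions (for a conjugate Gaussian linear model the evidence is exact; variational Bayes would approximate it for the non-conjugate models the paper targets).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on x0 and on the nonlinear feature x0*x1.
n = 200
X = rng.normal(size=(n, 4))
y = 2.0 * X[:, 0] + 1.5 * X[:, 0] * X[:, 1] + rng.normal(scale=0.5, size=n)

# Candidate feature pool: raw inputs plus a few nonlinear projections,
# standing in for BGNLM's space of nonlinear features (hypothetical choice).
pool = [X[:, i] for i in range(4)] + [X[:, 0] * X[:, 1],
                                      np.sin(X[:, 2]),
                                      X[:, 3] ** 2]

def log_evidence(subset, alpha=1.0, sigma2=0.25):
    """Log marginal likelihood of a Gaussian linear model on the chosen
    features (prior N(0, 1/alpha * I), known noise variance sigma2).
    Closed-form here; it plays the role an ELBO plays under variational Bayes."""
    Phi = np.column_stack([pool[j] for j in subset]) if subset else np.zeros((n, 1))
    d = Phi.shape[1]
    A = Phi.T @ Phi / sigma2 + alpha * np.eye(d)   # posterior precision
    m = np.linalg.solve(A, Phi.T @ y / sigma2)     # posterior mean
    _, logdet = np.linalg.slogdet(A)
    return (-0.5 * n * np.log(2 * np.pi * sigma2)
            - 0.5 * (y @ y / sigma2 - m @ A @ m)
            + 0.5 * d * np.log(alpha) - 0.5 * logdet)

def mutate(subset):
    """Flip one randomly chosen feature in or out of the model."""
    j = int(rng.integers(len(pool)))
    s = set(subset)
    s.symmetric_difference_update({j})
    return tuple(sorted(s))

# Evolutionary search: a population of feature subsets, scored by evidence,
# with elitist selection over parents and mutated children.
population = [tuple(sorted(int(j) for j in rng.choice(len(pool), size=2, replace=False)))
              for _ in range(8)]
for generation in range(30):
    children = [mutate(s) for s in population]
    scored = sorted(set(population + children), key=log_evidence, reverse=True)
    population = scored[:8]                        # keep the fittest subsets

best = population[0]
print("selected features:", best)
```

On this toy problem the search settles on the subset containing feature 0 (the raw input `x0`) and feature 4 (the interaction `x0*x1`), matching the generating model; swapping `log_evidence` for a per-model ELBO is the step where the variational approximation of the paper would enter.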