Prefetching-based Multiproposal Markov Chain Monte Carlo Algorithm

Guifeng Ye;Shaowen Lu
{"title":"Prefetching-based Multiproposal Markov Chain Monte Carlo Algorithm","authors":"Guifeng Ye;Shaowen Lu","doi":"10.1109/TAI.2024.3385384","DOIUrl":null,"url":null,"abstract":"Our proposed algorithm is a prefetching-based multiproposal Markov Chain Monte Carlo (PMP-MCMC) method that efficiently explores the target distribution by combining multiple proposals with the concept of prefetching. In our method, not all proposals are directly derived from the current state; some are derived from future states. This approach breaks through the inherent sequential characteristics of traditional MCMC algorithms. Compared with single-proposal and multiproposal methods, our approach speeds up by \n<inline-formula><tex-math>$K$</tex-math></inline-formula>\n times and the burn-in period is reduced by a factor of \n<inline-formula><tex-math>$1/\\text{log}_{2}K$</tex-math></inline-formula>\n maximally, where \n<inline-formula><tex-math>$K$</tex-math></inline-formula>\n is the number of parallel computational units or processing cores. Compared with prefetching method, our method has increased the number of samples per iteration by a factor of \n<inline-formula><tex-math>$K/\\text{log}_{2}K$</tex-math></inline-formula>\n. Furthermore, the proposed method is general and can be integrated into MCMC variants such as Hamiltonian Monte Carlo (HMC). 
We have also applied this method to optimize the model parameters of neural networks and Bayesian inference and observed significant improvements in optimization performance.","PeriodicalId":73305,"journal":{"name":"IEEE transactions on artificial intelligence","volume":"5 9","pages":"4493-4505"},"PeriodicalIF":0.0000,"publicationDate":"2024-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on artificial intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10494066/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Our proposed algorithm is a prefetching-based multiproposal Markov Chain Monte Carlo (PMP-MCMC) method that efficiently explores the target distribution by combining multiple proposals with the concept of prefetching. In our method, not all proposals are derived directly from the current state; some are derived from future states. This approach breaks through the inherently sequential character of traditional MCMC algorithms. Compared with single-proposal and multiproposal methods, our approach achieves up to a $K$-fold speedup, and the burn-in period is reduced by a factor of up to $1/\log_{2}K$, where $K$ is the number of parallel computational units or processing cores. Compared with the prefetching method, our method increases the number of samples per iteration by a factor of $K/\log_{2}K$. Furthermore, the proposed method is general and can be integrated into MCMC variants such as Hamiltonian Monte Carlo (HMC). We have also applied this method to optimize the model parameters of neural networks and to Bayesian inference, and observed significant improvements in optimization performance.
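To make the prefetching idea concrete, the sketch below shows a generic prefetching Metropolis-Hastings step (not the authors' PMP-MCMC implementation): at each round the binary tree of the next `depth` accept/reject outcomes is built speculatively, so the $2^{\text{depth}}-1$ proposal-density evaluations could run in parallel on that many cores, after which the chain walks the tree using the real acceptance decisions. The target density, proposal scale, and tree layout here are illustrative assumptions; evaluations run serially for clarity.

```python
import math
import random


def log_target(x):
    # Toy target: standard normal log-density (illustration only).
    return -0.5 * x * x


def prefetch_mh(x0, n_steps, depth=2, step=1.0, seed=0):
    """Sketch of prefetching Metropolis-Hastings.

    Each round speculatively expands the binary tree of the next
    `depth` accept/reject outcomes. The 2**depth - 1 proposal
    evaluations are the parallelizable work; here they run serially.
    """
    rng = random.Random(seed)
    samples = []
    x = x0
    while len(samples) < n_steps:
        # Node 1 holds the current state. For node i, child 2*i is the
        # "reject" outcome (stay put) and 2*i + 1 is the "accept"
        # outcome (a fresh symmetric random-walk proposal).
        tree = {1: x}
        for i in range(1, 2 ** depth):
            tree[2 * i] = tree[i]
            tree[2 * i + 1] = tree[i] + rng.gauss(0.0, step)
        # These log-density evaluations could be farmed out to K cores.
        logp = {i: log_target(v) for i, v in tree.items()}
        # Walk the tree with the actual Metropolis accept/reject tests.
        node = 1
        for _ in range(depth):
            accept = math.log(rng.random()) < logp[2 * node + 1] - logp[node]
            node = 2 * node + 1 if accept else 2 * node
            samples.append(tree[node])
        x = tree[node]
    return samples[:n_steps]
```

With `depth = 2` one round yields two chain steps from three speculative proposals, matching the idea that with $K$ cores a plain prefetching chain advances only about $\log_{2}K$ steps per parallel round, which is the gap the multiproposal combination in the paper targets.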