{"title":"Prefetching-based Multiproposal Markov Chain Monte Carlo Algorithm","authors":"Guifeng Ye;Shaowen Lu","doi":"10.1109/TAI.2024.3385384","DOIUrl":null,"url":null,"abstract":"Our proposed algorithm is a prefetching-based multiproposal Markov Chain Monte Carlo (PMP-MCMC) method that efficiently explores the target distribution by combining multiple proposals with the concept of prefetching. In our method, not all proposals are directly derived from the current state; some are derived from future states. This approach breaks through the inherent sequential characteristics of traditional MCMC algorithms. Compared with single-proposal and multiproposal methods, our approach speeds up by \n<inline-formula><tex-math>$K$</tex-math></inline-formula>\n times and the burn-in period is reduced by a factor of \n<inline-formula><tex-math>$1/\\text{log}_{2}K$</tex-math></inline-formula>\n maximally, where \n<inline-formula><tex-math>$K$</tex-math></inline-formula>\n is the number of parallel computational units or processing cores. Compared with prefetching method, our method has increased the number of samples per iteration by a factor of \n<inline-formula><tex-math>$K/\\text{log}_{2}K$</tex-math></inline-formula>\n. Furthermore, the proposed method is general and can be integrated into MCMC variants such as Hamiltonian Monte Carlo (HMC). We have also applied this method to optimize the model parameters of neural networks and Bayesian inference and observed significant improvements in optimization performance.","PeriodicalId":73305,"journal":{"name":"IEEE transactions on artificial intelligence","volume":"5 9","pages":"4493-4505"},"PeriodicalIF":0.0000,"publicationDate":"2024-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on artificial intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10494066/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Our proposed algorithm is a prefetching-based multiproposal Markov chain Monte Carlo (PMP-MCMC) method that efficiently explores the target distribution by combining multiple proposals with the concept of prefetching. In our method, not all proposals are derived directly from the current state; some are derived from speculative future states. This approach breaks the inherently sequential structure of traditional MCMC algorithms. Compared with single-proposal and multiproposal methods, our approach achieves up to a $K$-fold speedup, and the burn-in period is reduced by a factor of up to $1/\log_{2}K$, where $K$ is the number of parallel computational units or processing cores. Compared with the prefetching method, our method increases the number of samples per iteration by a factor of $K/\log_{2}K$. Furthermore, the proposed method is general and can be integrated into MCMC variants such as Hamiltonian Monte Carlo (HMC). We have also applied the method to optimizing the model parameters of neural networks and to Bayesian inference, and observed significant improvements in optimization performance.
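To make the prefetching idea concrete, the minimal Python sketch below speculatively evaluates a binary tree of possible future states and then walks several Metropolis accept/reject steps through the pre-evaluated tree. This is an illustration of the classical prefetching scheme that PMP-MCMC builds on, not the authors' implementation: the 1-D standard-normal target, the Gaussian random-walk proposal, the tree depth, and the serial loop standing in for parallel density evaluations are all assumptions made for illustration.

import numpy as np

# Illustrative sketch of prefetching Metropolis-Hastings: speculatively
# evaluate a binary tree of depth D of possible future states, then walk
# D accept/reject steps through the pre-evaluated tree. Assumptions (not
# from the paper): 1-D standard-normal target, Gaussian random-walk
# proposals, and a serial loop standing in for the parallel density
# evaluations that would run on separate cores.

def log_target(x):
    return -0.5 * x * x  # unnormalized log-density of a standard normal

def prefetch_mh(n_rounds, depth=3, step=1.0, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_rounds):
        # Node 1 holds the current state; the children of node i are
        # "proposal rejected" (index 2*i, same state) and a fresh Gaussian
        # proposal drawn from node i's state (index 2*i + 1).
        states = np.empty(2 ** (depth + 1))
        states[1] = x
        for i in range(1, 2 ** depth):
            states[2 * i] = states[i]
            states[2 * i + 1] = states[i] + step * rng.normal()
        # These density evaluations would be farmed out to parallel cores;
        # here they run serially (and redundantly re-evaluate the duplicated
        # reject-branch states, for simplicity).
        logps = np.full(states.shape, np.nan)
        logps[1:] = [log_target(s) for s in states[1:]]
        # Walk the pre-evaluated tree with ordinary Metropolis decisions.
        i = 1
        for _ in range(depth):
            accept = np.log(rng.uniform()) < logps[2 * i + 1] - logps[i]
            i = 2 * i + 1 if accept else 2 * i
            samples.append(states[i])
        x = states[i]
    return np.array(samples)

if __name__ == "__main__":
    draws = prefetch_mh(n_rounds=2000, depth=3)
    print(draws.mean(), draws.std())  # close to 0 and 1 for a standard normal

In this sketch, each round spends on the order of $2^{D}$ speculative density evaluations to advance the chain by $D$ accepted/rejected steps; this trade-off between parallel evaluations and chain progress is what the abstract's $K$ and $\log_{2}K$ factors refer to.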