{"title":"贝叶斯优化顺序代理(BOSS)算法:针对广泛的贝叶斯层次模型的快速贝叶斯推理","authors":"Dayi Li , Ziang Zhang","doi":"10.1016/j.csda.2025.108253","DOIUrl":null,"url":null,"abstract":"<div><div>Approximate Bayesian inference based on Laplace approximation and quadrature has become increasingly popular for its efficiency in fitting latent Gaussian models (LGM). However, many useful models can only be fitted as LGMs if some conditioning parameters are fixed. Such models are termed conditional LGMs, with examples including change-point detection, non-linear regression, and many others. Existing methods for fitting conditional LGMs rely on grid search or sampling-based approaches to explore the posterior density of the conditioning parameters; both require a large number of evaluations of the unnormalized posterior density of the conditioning parameters. Since each evaluation requires fitting a separate LGM, these methods become computationally prohibitive beyond simple scenarios. In this work, the Bayesian Optimization Sequential Surrogate (BOSS) algorithm is introduced, which combines Bayesian optimization with approximate Bayesian inference methods to significantly reduce the computational resources required for fitting conditional LGMs. With orders of magnitude fewer evaluations than those required by the existing methods, BOSS efficiently generates sequential design points that capture the majority of the posterior mass of the conditioning parameters and subsequently yields an accurate surrogate posterior distribution that can be easily normalized. The efficiency, accuracy, and practical utility of BOSS are demonstrated through extensive simulation studies and real-world applications in epidemiology, environmental sciences, and astrophysics.</div></div>","PeriodicalId":55225,"journal":{"name":"Computational Statistics & Data Analysis","volume":"213 ","pages":"Article 108253"},"PeriodicalIF":1.6000,"publicationDate":"2025-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bayesian optimization sequential surrogate (BOSS) algorithm: Fast Bayesian inference for a broad class of Bayesian hierarchical models\",\"authors\":\"Dayi Li , Ziang Zhang\",\"doi\":\"10.1016/j.csda.2025.108253\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Approximate Bayesian inference based on Laplace approximation and quadrature has become increasingly popular for its efficiency in fitting latent Gaussian models (LGM). However, many useful models can only be fitted as LGMs if some conditioning parameters are fixed. Such models are termed conditional LGMs, with examples including change-point detection, non-linear regression, and many others. Existing methods for fitting conditional LGMs rely on grid search or sampling-based approaches to explore the posterior density of the conditioning parameters; both require a large number of evaluations of the unnormalized posterior density of the conditioning parameters. Since each evaluation requires fitting a separate LGM, these methods become computationally prohibitive beyond simple scenarios. In this work, the Bayesian Optimization Sequential Surrogate (BOSS) algorithm is introduced, which combines Bayesian optimization with approximate Bayesian inference methods to significantly reduce the computational resources required for fitting conditional LGMs. 
With orders of magnitude fewer evaluations than those required by the existing methods, BOSS efficiently generates sequential design points that capture the majority of the posterior mass of the conditioning parameters and subsequently yields an accurate surrogate posterior distribution that can be easily normalized. The efficiency, accuracy, and practical utility of BOSS are demonstrated through extensive simulation studies and real-world applications in epidemiology, environmental sciences, and astrophysics.</div></div>\",\"PeriodicalId\":55225,\"journal\":{\"name\":\"Computational Statistics & Data Analysis\",\"volume\":\"213 \",\"pages\":\"Article 108253\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2025-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Statistics & Data Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S016794732500129X\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Statistics & Data Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S016794732500129X","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Bayesian optimization sequential surrogate (BOSS) algorithm: Fast Bayesian inference for a broad class of Bayesian hierarchical models
Approximate Bayesian inference based on Laplace approximation and quadrature has become increasingly popular for its efficiency in fitting latent Gaussian models (LGMs). However, many useful models can only be fitted as LGMs if some conditioning parameters are fixed. Such models are termed conditional LGMs, with examples including change-point detection, non-linear regression, and many others. Existing methods for fitting conditional LGMs rely on grid search or sampling-based approaches to explore the posterior density of the conditioning parameters; both require a large number of evaluations of the unnormalized posterior density of the conditioning parameters. Since each evaluation requires fitting a separate LGM, these methods become computationally prohibitive beyond simple scenarios. In this work, the Bayesian Optimization Sequential Surrogate (BOSS) algorithm is introduced, which combines Bayesian optimization with approximate Bayesian inference methods to significantly reduce the computational resources required for fitting conditional LGMs. With orders of magnitude fewer evaluations than those required by the existing methods, BOSS efficiently generates sequential design points that capture the majority of the posterior mass of the conditioning parameters and subsequently yields an accurate surrogate posterior distribution that can be easily normalized. The efficiency, accuracy, and practical utility of BOSS are demonstrated through extensive simulation studies and real-world applications in epidemiology, environmental sciences, and astrophysics.
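The abstract describes the algorithm only at a high level. Below is a minimal, self-contained Python sketch of the general idea under strong simplifying assumptions: a one-dimensional conditioning parameter, a toy closed-form stand-in (`log_post`) for the expensive LGM fit, a hand-rolled Gaussian-process surrogate, and an upper-confidence-bound acquisition rule. All names, kernels, and tuning constants are illustrative choices made here, not details taken from the paper or its software.

```python
# Minimal sketch of the BOSS idea (illustrative only, not the paper's implementation).
import numpy as np

def log_post(theta):
    """Toy stand-in for the expensive step. In a real conditional LGM, each call
    would fit a separate LGM with the conditioning parameter fixed at `theta` and
    return its unnormalized log posterior; here a simple closed-form bimodal
    function keeps the sketch self-contained."""
    return -0.5 * ((theta - 1.0) / 0.3) ** 2 + np.log(
        1.0 + 0.5 * np.exp(-0.5 * ((theta + 1.0) / 0.4) ** 2)
    )

def rbf_kernel(a, b, length=0.5, var=4.0):
    """Squared-exponential covariance between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_grid, jitter=1e-5):
    """Gaussian-process regression mean and variance on a grid of candidate points."""
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_grid, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = np.clip(
        rbf_kernel(x_grid, x_grid).diagonal()
        - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1),
        1e-12, None,
    )
    return mean, var

grid = np.linspace(-3.0, 3.0, 400)   # candidate values of the conditioning parameter
design = np.array([-2.0, 0.0, 2.0])  # small initial design
values = log_post(design)

for _ in range(10):                  # sequential design: 13 expensive evaluations in total
    mean, var = gp_predict(design, values, grid)
    ucb = mean + 2.0 * np.sqrt(var)  # upper-confidence-bound acquisition
    theta_new = grid[np.argmax(ucb)]
    design = np.append(design, theta_new)
    values = np.append(values, log_post(theta_new))

# Surrogate posterior: exponentiate the surrogate log posterior and normalize numerically.
mean, _ = gp_predict(design, values, grid)
unnorm = np.exp(mean - mean.max())   # shift by the max for numerical stability
surrogate_post = unnorm / (unnorm.sum() * (grid[1] - grid[0]))
```

Each call to `log_post` in this sketch is cheap; in the setting of the paper it stands for a full approximate-Bayesian fit of an LGM, which is why an acquisition rule that concentrates a handful of sequential evaluations where the posterior mass lies is the source of the reported speed-up.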
Journal description:
Computational Statistics and Data Analysis (CSDA), an Official Publication of the network Computational and Methodological Statistics (CMStatistics) and of the International Association for Statistical Computing (IASC), is an international journal dedicated to the dissemination of methodological research and applications in the areas of computational statistics and data analysis. The journal consists of four refereed sections which are divided into the following subject areas:
I) Computational Statistics - Manuscripts dealing with: 1) the explicit impact of computers on statistical methodology (e.g., Bayesian computing, bioinformatics, computer graphics, computer intensive inferential methods, data exploration, data mining, expert systems, heuristics, knowledge based systems, machine learning, neural networks, numerical and optimization methods, parallel computing, statistical databases, statistical systems), and 2) the development, evaluation and validation of statistical software and algorithms. Software and algorithms can be submitted with manuscripts and will be stored together with the online article.
II) Statistical Methodology for Data Analysis - Manuscripts dealing with novel and original data analytical strategies and methodologies applied in biostatistics (design and analytic methods for clinical trials, epidemiological studies, statistical genetics, or genetic/environmental interactions), chemometrics, classification, data exploration, density estimation, design of experiments, environmetrics, education, image analysis, marketing, model free data exploration, pattern recognition, psychometrics, statistical physics, image processing, robust procedures.
[...]
III) Special Applications - [...]
IV) Annals of Statistical Data Science [...]