Bayesian Analysis: Latest Publications

Improving multilevel regression and poststratification with structured priors.
IF 4.9, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-09-01, Epub Date: 2020-07-15, DOI: 10.1214/20-ba1223
Yuxiang Gao, Lauren Kennedy, Daniel Simpson, Andrew Gelman
Abstract: A central theme in the field of survey statistics is estimating population-level quantities through data coming from potentially non-representative samples of the population. Multilevel regression and poststratification (MRP), a model-based approach, is gaining traction against the traditional weighted approach for survey estimates. MRP estimates are susceptible to bias if there is an underlying structure that the methodology does not capture. This work aims to provide a new framework for specifying structured prior distributions that lead to bias reduction in MRP estimates. We use simulation studies to explore the benefit of these prior distributions and demonstrate their efficacy on non-representative US survey data. We show that structured prior distributions offer absolute bias reduction and variance reduction for posterior MRP estimates in a large variety of data regimes.
Bayesian Analysis 16(3): 719-744. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9203002/pdf/nihms-1811398.pdf. An illustrative code sketch follows this entry.
Citations: 0
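As context for the abstract above, the poststratification step in MRP reduces to a population-weighted average of cell-level model estimates. Below is a minimal sketch; the cell counts and estimates are hypothetical placeholders, not values from the paper, and the multilevel regression that would produce the cell estimates is omitted.

```python
import numpy as np

# Hypothetical poststratification cells (e.g. age group x education), with
# known population counts N_j and model-based estimates theta_j for each cell.
population_counts = np.array([1200, 800, 2500, 1500])   # N_j from a census table
cell_estimates = np.array([0.42, 0.55, 0.38, 0.61])     # theta_j from a multilevel model

# MRP point estimate: population-weighted average of the cell estimates,
# theta_pop = sum_j N_j * theta_j / sum_j N_j.
mrp_estimate = np.sum(population_counts * cell_estimates) / np.sum(population_counts)
print(f"Poststratified estimate: {mrp_estimate:.3f}")
```

Structured priors act one step earlier, in how the cell-level estimates are shrunk toward each other; the averaging step itself is unchanged.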
Contaminated Gibbs-Type Priors
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-08-26, DOI: 10.1214/22-ba1358
F. Camerlenghi, R. Corradin, A. Ongaro
Abstract: Gibbs-type priors are widely used as key components in several Bayesian nonparametric models. By virtue of their flexibility and mathematical tractability, they turn out to be predominant priors in species sampling problems, clustering and mixture modelling. We introduce a new family of processes which extends the Gibbs-type one by including a contaminant component in the model to account for the presence of anomalies (outliers) or an excess of observations with frequency one. We first investigate the induced random partition and the associated predictive distribution, and we characterize the asymptotic behaviour of the number of clusters. All the results we obtain are in closed form and easily interpretable; as a noteworthy example, we focus on the contaminated version of the Pitman-Yor process. Finally, we pinpoint the advantage of our construction in different applied problems: we show how the contaminant component helps to perform outlier detection for an astronomical clustering problem and to improve predictive inference in a species-related dataset exhibiting a high number of species with frequency one.
An illustrative code sketch follows this entry.
Citations: 0
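For readers unfamiliar with the baseline being contaminated, the (uncontaminated) Pitman-Yor process named in the abstract induces a random partition through a simple sequential predictive rule. The sketch below simulates that standard rule only; the contaminant component of the paper is not reproduced, and the parameter values are arbitrary.

```python
import numpy as np

def sample_pitman_yor_partition(n, theta=1.0, d=0.25, rng=None):
    """Sample a random partition of n items from the Pitman-Yor predictive rule.

    Item i joins an existing cluster k of size n_k with prob. (n_k - d) / (theta + i),
    or opens a new cluster with prob. (theta + d * K) / (theta + i)."""
    rng = np.random.default_rng(rng)
    sizes = []             # current cluster sizes n_k
    labels = []            # cluster label of each item
    for i in range(n):
        K = len(sizes)
        weights = np.array([nk - d for nk in sizes] + [theta + d * K])
        probs = weights / (theta + i)      # weights sum to theta + i by construction
        k = rng.choice(K + 1, p=probs)
        if k == K:
            sizes.append(1)
        else:
            sizes[k] += 1
        labels.append(k)
    return labels, sizes

labels, sizes = sample_pitman_yor_partition(100, theta=1.0, d=0.25, rng=0)
print("number of clusters:", len(sizes))
```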
The Attraction Indian Buffet Distribution
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-06-09, DOI: 10.1214/21-ba1279
R. Warr, D. B. Dahl, Jeremy M. Meyer, Arthur Lui
Abstract: We propose the attraction Indian buffet distribution (AIBD), a distribution for binary feature matrices influenced by pairwise similarity information. Binary feature matrices are used in Bayesian models to uncover latent variables (i.e., features) that explain observed data. The Indian buffet process (IBP) is a popular exchangeable prior distribution for latent feature matrices. In the presence of additional information, however, the exchangeability assumption is not reasonable or desirable. The AIBD can incorporate pairwise similarity information, yet it preserves many properties of the IBP, including the distribution of the total number of features. Thus, much of the interpretation and intuition that one has for the IBP directly carries over to the AIBD. A temperature parameter controls the degree to which the similarity information affects feature-sharing between observations. Unlike other nonexchangeable distributions for feature allocations, the probability mass function of the AIBD has a tractable normalizing constant, making posterior inference on hyperparameters straightforward using standard MCMC methods. A novel posterior sampling algorithm is proposed for the IBP and the AIBD. We demonstrate the feasibility of the AIBD as a prior distribution in feature allocation models and compare the performance of competing methods in simulations and an application.
An illustrative code sketch follows this entry.
Citations: 1
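The exchangeable baseline that the AIBD modifies, the Indian buffet process, has a well-known "restaurant" construction: customer i takes each previously sampled dish k with probability m_k / i and then tries a Poisson(alpha / i) number of new dishes. A minimal sketch of that standard IBP (not the AIBD, which additionally reweights dishes by pairwise similarity):

```python
import numpy as np

def sample_ibp(n_customers, alpha=2.0, rng=None):
    """Simulate a binary feature matrix from the standard Indian buffet process."""
    rng = np.random.default_rng(rng)
    rows = []            # one list of feature indicators per customer
    dish_counts = []     # m_k: how many customers have taken dish k so far
    for i in range(1, n_customers + 1):
        # Existing dishes: take dish k with probability m_k / i.
        row = [int(rng.random() < m / i) for m in dish_counts]
        for k, taken in enumerate(row):
            dish_counts[k] += taken
        # New dishes: Poisson(alpha / i) previously untasted dishes.
        n_new = rng.poisson(alpha / i)
        row.extend([1] * n_new)
        dish_counts.extend([1] * n_new)
        rows.append(row)
    # Pad rows to a rectangular 0/1 matrix.
    K = len(dish_counts)
    Z = np.zeros((n_customers, K), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

print(sample_ibp(10, alpha=2.0, rng=1))
```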
Seemingly Unrelated Multi-State Processes: A Bayesian Semiparametric Approach
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-06-06, DOI: 10.1214/22-ba1326
Andrea Cremaschi, Raffele Argiento, M. Iorio, S. Cai, Y. Chong, M. Meaney, Michelle Z L Kee
Abstract: Many applications in medical statistics as well as in other fields can be described by transitions between multiple states (e.g. from health to disease) experienced by individuals over time. In this context, multi-state models are a popular statistical technique, in particular when the exact transition times are not observed. The key quantities of interest are the transition rates, capturing the instantaneous risk of moving from one state to another. The main contribution of this work is to propose a joint semiparametric model for several possibly related multi-state processes (Seemingly Unrelated Multi-State, SUMS, processes), assuming a Markov structure for the transitions over time. The dependence between different processes is captured by specifying a joint random effect distribution on the transition rates of each process. We assume a flexible random effect distribution, which allows for clustering of the individuals, overdispersion and outliers. Moreover, we employ a graph structure to describe the dependence among processes, exploiting tools from the Gaussian Graphical model literature. It is also possible to include covariate effects. We use our approach to model disease progression in mental health. Posterior inference is performed through a specially devised MCMC algorithm.
An illustrative code sketch follows this entry.
Citations: 2
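The transition rates that SUMS places joint random effects on are the entries of a continuous-time Markov chain generator; simulating one trajectory from a fixed generator makes their role concrete. The 3-state rate matrix below is a hypothetical illness-death-style example, unrelated to the paper's mental-health data.

```python
import numpy as np

# Hypothetical generator Q for a 3-state process (rows sum to zero);
# Q[r, s] is the instantaneous rate of moving from state r to state s.
Q = np.array([[-0.30,  0.20,  0.10],
              [ 0.05, -0.15,  0.10],
              [ 0.00,  0.00,  0.00]])   # state 2 is absorbing

def simulate_ctmc(Q, start=0, t_max=20.0, rng=None):
    """Simulate one continuous-time Markov chain path up to roughly t_max."""
    rng = np.random.default_rng(rng)
    t, state, path = 0.0, start, [(0.0, start)]
    while t < t_max:
        rate_out = -Q[state, state]
        if rate_out <= 0:                            # absorbing state reached
            break
        t += rng.exponential(1.0 / rate_out)         # holding time ~ Exp(rate_out)
        probs = Q[state].clip(min=0.0) / rate_out    # jump probabilities
        state = rng.choice(len(Q), p=probs)
        path.append((t, state))
    return path

print(simulate_ctmc(Q, rng=3))
```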
Bayesian Multiple Changepoint Detection for Stochastic Models in Continuous Time
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-06-01, DOI: 10.1214/20-ba1218
Lu Shaochuan
Abstract: A multiple changepoint model in continuous time is formulated as a continuous-time hidden Markov model defined on a countably infinite state space. The new formulation allows the model complexity, i.e. the number of changepoints, to grow unboundedly as new data arrive. Inference on the number of changepoints and their locations is based on a collapsed Gibbs sampler. We suggest a new version of the forward-filtering backward-sampling (FFBS) algorithm in continuous time for simulating the full trajectory of the latent Markov chain, i.e. the changepoints. The FFBS algorithm is based on a randomized time-discretization of the latent Markov chain through uniformization schemes, combined with a discrete-time version of the FFBS algorithm. It is shown that, desirably, both the computational cost and the memory cost of the FFBS algorithm are only quadratic in the number of changepoints. The new formulation also allows varying scales of changepoint run lengths to be characterized. We demonstrate the methods through simulations and a real-data example on earthquakes.
An illustrative code sketch follows this entry.
Citations: 6
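The uniformization scheme underlying the continuous-time FFBS replaces the generator Q by a discrete-time kernel P = I + Q/Omega, with Omega at least the largest exit rate, run at the event times of a Poisson(Omega) process. A small sketch of that standard identity (not the paper's full collapsed Gibbs/FFBS sampler), with an arbitrary two-state generator:

```python
import math
import numpy as np
from scipy.linalg import expm

def uniformize(Q, omega=None):
    """Uniformization: represent a CTMC with generator Q as a discrete-time
    chain with kernel P = I + Q/omega, run at Poisson(omega) event times."""
    Q = np.asarray(Q, dtype=float)
    if omega is None:
        omega = 1.1 * np.max(-np.diag(Q))    # any omega >= max exit rate works
    P = np.eye(len(Q)) + Q / omega
    return P, omega

Q = np.array([[-0.3,  0.3],
              [ 0.1, -0.1]])
P, omega = uniformize(Q)

# Sanity check: exp(Q t) equals the Poisson-weighted sum of powers of P.
t = 2.0
series = sum(math.exp(-omega * t) * (omega * t) ** n / math.factorial(n)
             * np.linalg.matrix_power(P, n) for n in range(50))
print(np.allclose(series, expm(Q * t)))    # expected: True
```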
A Bayesian Approach for Partial Gaussian Graphical Models With Sparsity
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-05-23, DOI: 10.1214/22-ba1315
Eunice Okome Obiang, Pascal Jézéquel, Frédéric Proia
Abstract: We explore various Bayesian approaches to estimate partial Gaussian graphical models. Our hierarchical structures enable us to deal with single-output as well as multiple-output linear regressions, in small or high dimension, enforcing either no sparsity, sparsity, group sparsity or even sparse-group sparsity for a bi-level selection through partial correlations (direct links) between predictors and responses, thanks to spike-and-slab priors corresponding to each setting. Adaptive and global shrinkage are also incorporated in the Bayesian modeling of the direct links. An existing result for model selection consistency is reformulated to match our sparse and group-sparse settings, providing a theoretical guarantee under some technical assumptions. Gibbs samplers are developed, and a simulation study shows the efficiency of our models, which give very competitive results, especially in terms of support recovery. To conclude, a real dataset is investigated.
An illustrative code sketch follows this entry.
Citations: 1
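The spike-and-slab priors used for the direct links mix a near-zero "spike" with a diffuse "slab", controlled by latent inclusion indicators. A toy draw from a Gaussian version, with made-up hyperparameters rather than the authors' exact hierarchy:

```python
import numpy as np

def spike_and_slab_draw(p, pi=0.2, slab_sd=2.0, spike_sd=0.01, rng=None):
    """Draw a coefficient vector from a Gaussian spike-and-slab prior:
    beta_j ~ N(0, slab_sd^2) with prob. pi (active direct link) and
    beta_j ~ N(0, spike_sd^2) with prob. 1 - pi (effectively zero)."""
    rng = np.random.default_rng(rng)
    gamma = rng.random(p) < pi                 # latent inclusion indicators
    sd = np.where(gamma, slab_sd, spike_sd)
    beta = rng.normal(0.0, sd)
    return beta, gamma

beta, gamma = spike_and_slab_draw(10, rng=42)
print("active predictors:", np.flatnonzero(gamma))
print("coefficients:", np.round(beta, 3))
```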
A Metropolized Adaptive Subspace Algorithm for High-Dimensional Bayesian Variable Selection
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-05-03, DOI: 10.1214/22-BA1351
C. Staerk, M. Kateri, I. Ntzoufras
Abstract: A simple and efficient adaptive Markov Chain Monte Carlo (MCMC) method, called the Metropolized Adaptive Subspace (MAdaSub) algorithm, is proposed for sampling from high-dimensional posterior model distributions in Bayesian variable selection. The MAdaSub algorithm is based on an independent Metropolis-Hastings sampler, where the individual proposal probabilities of the explanatory variables are updated after each iteration using a form of Bayesian adaptive learning, so that they eventually converge to the respective covariates’ posterior inclusion probabilities. We prove the ergodicity of the algorithm and present a parallel version of MAdaSub with an adaptation scheme for the proposal probabilities based on the combination of information from multiple chains. The effectiveness of the algorithm is demonstrated via various simulated and real data examples, including a high-dimensional problem with more than 20,000 covariates.
An illustrative code sketch follows this entry.
Citations: 0
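The core mechanism described above, an independent Metropolis-Hastings sampler over inclusion vectors whose Bernoulli proposal probabilities are gradually adapted toward the running inclusion frequencies, can be sketched on a toy unnormalized posterior. Everything below (the toy target, the adaptation rate, the clipping bounds) is an illustrative assumption, not the paper's exact update rule.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 8
scores = rng.normal(0.0, 1.0, p)

def log_target(gamma):
    """Toy unnormalized log-posterior over inclusion vectors gamma in {0,1}^p."""
    return gamma @ scores - 1.5 * gamma.sum() - 2.0 * gamma[0] * gamma[1]

def log_proposal(gamma, q):
    """Log-probability of gamma under independent Bernoulli(q_j) proposals."""
    return np.sum(gamma * np.log(q) + (1 - gamma) * np.log1p(-q))

q = np.full(p, 0.5)                        # proposal inclusion probabilities
gamma = (rng.random(p) < q).astype(int)
inclusion_sum = np.zeros(p)
n_iter = 5000
for t in range(1, n_iter + 1):
    prop = (rng.random(p) < q).astype(int)
    # Independent MH acceptance ratio: pi(prop) q(gamma) / (pi(gamma) q(prop)).
    log_alpha = (log_target(prop) - log_target(gamma)
                 + log_proposal(gamma, q) - log_proposal(prop, q))
    if np.log(rng.random()) < log_alpha:
        gamma = prop
    inclusion_sum += gamma
    # Diminishing adaptation: nudge q toward the running inclusion frequencies.
    eps = 0.01
    q = np.clip((1 - 1 / (t + 1)) * q + (1 / (t + 1)) * (inclusion_sum / t),
                eps, 1 - eps)

print("estimated posterior inclusion probabilities:",
      np.round(inclusion_sum / n_iter, 2))
```

The diminishing 1/(t+1) adaptation weight is the usual device for keeping such adaptive samplers valid as the proposal stabilizes.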
Inverse Bayesian Optimization: Learning Human Acquisition Functions in an Exploration vs Exploitation Search Task
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-04-16, DOI: 10.1214/21-BA1303
N. Sandholtz, Yohsuke R. Miyamoto, L. Bornn, Maurice A. Smith
Abstract: This paper introduces a probabilistic framework to estimate parameters of an acquisition function given observed human behavior that can be modeled as a collection of sample paths from a Bayesian optimization procedure. The methodology involves defining a likelihood on observed human behavior from an optimization task, where the likelihood is parameterized by a Bayesian optimization subroutine governed by an unknown acquisition function. This structure enables us to make inference on a subject’s acquisition function while allowing their behavior to deviate around the solution to the Bayesian optimization subroutine. To test our methods, we designed a sequential optimization task which forced subjects to balance exploration and exploitation in search of an invisible target location. Applying our proposed methods to the resulting data, we find that many subjects tend to exhibit exploration preferences beyond what standard acquisition functions can capture. Guided by the model discrepancies, we augment the candidate acquisition functions to yield a superior fit to the human behavior in this task.
An illustrative code sketch follows this entry.
Citations: 2
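The acquisition functions whose parameters are being inferred are standard Bayesian-optimization ingredients. Two common ones, upper confidence bound and expected improvement, are sketched below on hypothetical Gaussian-process posterior summaries (the means, standard deviations and incumbent value are made up); the paper's likelihood over human sample paths is not reproduced.

```python
import numpy as np
from scipy.stats import norm

def ucb(mu, sigma, kappa=2.0):
    """Upper confidence bound: favor high mean plus kappa * posterior sd."""
    return mu + kappa * sigma

def expected_improvement(mu, sigma, best_so_far):
    """Expected improvement over the current best observed value (maximization)."""
    z = (mu - best_so_far) / np.maximum(sigma, 1e-12)
    return (mu - best_so_far) * norm.cdf(z) + sigma * norm.pdf(z)

# Hypothetical posterior summaries over 5 candidate search locations.
mu = np.array([0.2, 0.5, 0.4, 0.9, 0.3])
sigma = np.array([0.30, 0.10, 0.40, 0.05, 0.20])
best = 0.6

print("UCB picks candidate:", np.argmax(ucb(mu, sigma)))
print("EI picks candidate: ", np.argmax(expected_improvement(mu, sigma, best)))
```

In a toy setting like this the two rules can prefer different candidates, which is exactly the exploration-exploitation tension the search task is designed to probe.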
Bayesian Functional Principal Components Analysis via Variational Message Passing with Multilevel Extensions
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-04-01, DOI: 10.1214/23-ba1393
T. Nolan, J. Goldsmith, D. Ruppert
Abstract: Functional principal components analysis is a popular tool for inference on functional data. Standard approaches rely on an eigendecomposition of a smoothed covariance surface in order to extract the orthonormal functions representing the major modes of variation. This approach can be a computationally intensive procedure, especially in the presence of large datasets with irregular observations. In this article, we develop a Bayesian approach, which aims to determine the Karhunen-Loève decomposition directly without the need to smooth and estimate a covariance surface. More specifically, we develop a variational Bayesian algorithm via message passing over a factor graph, which is more commonly referred to as variational message passing. Message passing algorithms are a powerful tool for compartmentalizing the algebra and coding required for inference in hierarchical statistical models. Recently, there has been much focus on formulating variational inference algorithms in the message passing framework because it removes the need for rederiving approximate posterior density functions if there is a change to the model. Instead, model changes are handled by changing specific computational units, known as fragments, within the factor graph. We extend the notion of variational message passing to functional principal components analysis. Indeed, this is the first article to address a functional data model via variational message passing. Our approach introduces two new fragments that are necessary for Bayesian functional principal components analysis. We present the computational details, a set of simulations for assessing accuracy and speed and an application to United States temperature data.
An illustrative code sketch follows this entry.
Citations: 3
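For contrast with the variational message passing route, the classical covariance-surface approach mentioned in the abstract amounts to an eigendecomposition of the sample covariance of the (discretized) curves. A compact sketch on simulated curves; the data-generating basis and noise level are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical functional data: n curves observed on a common grid of T points.
n, T = 50, 100
t = np.linspace(0, 1, T)
true_basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])  # 2 modes
scores = rng.normal(size=(n, 2)) * np.array([1.0, 0.5])
curves = scores @ true_basis + 0.1 * rng.normal(size=(n, T))

# Classical FPCA: eigendecompose the sample covariance of the centered curves
# (this is the covariance-surface route the VMP approach is designed to avoid).
centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / (n - 1)                # T x T covariance surface
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals[:2] / eigvals.sum()
print("variance explained by first two eigenfunctions:", np.round(explained, 3))
```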
Bayesian Optimal Experimental Design for Inferring Causal Structure
IF 4.4, Tier 2 (Mathematics)
Bayesian Analysis Pub Date: 2021-03-28, DOI: 10.1214/22-ba1335
M. Zemplenyi, Jeffrey W. Miller
Abstract: Inferring the causal structure of a system typically requires interventional data, rather than just observational data. Since interventional experiments can be costly, it is preferable to select interventions that yield the maximum amount of information about a system. We propose a novel Bayesian method for optimal experimental design by sequentially selecting interventions that minimize the expected posterior entropy as rapidly as possible. A key feature is that the method can be implemented by computing simple summaries of the current posterior, avoiding the computationally burdensome task of repeatedly performing posterior inference on hypothetical future datasets drawn from the posterior predictive. After deriving the method in a general setting, we apply it to the problem of inferring causal networks. We present a series of simulation studies in which we find that the proposed method performs favorably compared to existing alternative methods. Finally, we apply the method to real and simulated data from a protein-signaling network.
An illustrative code sketch follows this entry.
Citations: 3
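The design criterion described above, choosing the intervention that minimizes expected posterior entropy, can be written down directly when the hypothesis space and outcome space are discrete. The sketch below enumerates outcomes for two hypothetical experiments over three candidate structures; all probabilities are invented for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_posterior_entropy(prior, likelihoods):
    """Expected posterior entropy over a discrete hypothesis set for one experiment.

    likelihoods[h, y] = p(y | hypothesis h, this experiment); prior[h] = p(h)."""
    marginal = prior @ likelihoods                    # p(y) for each outcome y
    expected = 0.0
    for y, p_y in enumerate(marginal):
        if p_y > 0:
            posterior = prior * likelihoods[:, y] / p_y
            expected += p_y * entropy(posterior)
    return expected

# Hypothetical example: 3 candidate causal structures, 2 possible interventions,
# each with a binary outcome. The design rule picks the intervention with the
# smallest expected posterior entropy.
prior = np.array([0.5, 0.3, 0.2])
experiments = {
    "intervene on A": np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]),
    "intervene on B": np.array([[0.6, 0.4], [0.5, 0.5], [0.4, 0.6]]),
}
for name, lik in experiments.items():
    print(name, "-> expected posterior entropy:",
          round(expected_posterior_entropy(prior, lik), 3))
```

Note that this brute-force enumeration is the naive calculation; the paper's contribution is computing the same kind of criterion from simple summaries of the current posterior rather than by enumerating hypothetical future datasets.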