Latest Articles in Bayesian Analysis

An Explanatory Rationale for Priors Sharpened Into Occam’s Razors
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-12-01, DOI: 10.1214/19-ba1189
D. Bickel
Abstract: In Bayesian statistics, if the distribution of the data is unknown, then each plausible distribution of the data is indexed by a parameter value, and the prior distribution of the parameter is specified. To the extent that more complicated data distributions tend to require more coincidences for their construction than simpler data distributions, default prior distributions should be transformed to assign additional prior probability or probability density to the parameter values that refer to simpler data distributions. The proposed transformation of the prior distribution relies on the entropy of each data distribution as the relevant measure of complexity. The transformation is derived from a few first principles and extended to stochastic processes.
Citations: 6
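A minimal sketch of the entropy-based sharpening idea (the Bernoulli(θ) family, uniform default prior, and exponential tilting strength γ below are illustrative assumptions, not Bickel's exact transformation):

```python
import numpy as np

# Toy illustration: reweight a default prior so that parameter values indexing
# lower-entropy (simpler) data distributions receive extra prior mass.
# Family: Bernoulli(theta) on a grid; default prior: uniform (illustrative).
theta = np.linspace(0.01, 0.99, 99)              # parameter grid
default_prior = np.full_like(theta, 1.0 / theta.size)

# Shannon entropy of each Bernoulli(theta) data distribution (in nats).
entropy = -(theta * np.log(theta) + (1 - theta) * np.log(1 - theta))

# Exponential tilting toward low-entropy distributions (gamma > 0 is an
# illustrative sharpening strength, not a quantity from the paper).
gamma = 1.0
sharpened = default_prior * np.exp(-gamma * entropy)
sharpened /= sharpened.sum()

# The sharpened prior concentrates near theta = 0 and theta = 1, the
# "simplest" (most nearly deterministic) Bernoulli distributions.
print(theta[np.argmax(sharpened)], theta[np.argmin(sharpened)])
```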
Post-Processed Posteriors for Banded Covariances
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-11-25, DOI: 10.1214/22-ba1333
Kwangmin Lee, Kyoungjae Lee, Jaeyong Lee
Abstract: We consider Bayesian inference of banded covariance matrices and propose a post-processed posterior. The post-processing of the posterior consists of two steps. In the first step, posterior samples are obtained from the conjugate inverse-Wishart posterior, which does not satisfy any structural restrictions. In the second step, the posterior samples are transformed to satisfy the structural restriction through a post-processing function. The conceptually straightforward procedure of the post-processed posterior makes its computation efficient and can render interval estimators of functionals of covariance matrices. We show that it has nearly optimal minimax rates for banded covariances among all possible pairs of priors and post-processing functions. Furthermore, we prove that the expected coverage probability of the $(1-\alpha)100\%$ highest posterior density region of the post-processed posterior is asymptotically $1-\alpha$ with respect to a conventional posterior distribution. This implies that the highest posterior density region of the post-processed posterior is, on average, a credible set of a conventional posterior. The advantages of the post-processed posterior are demonstrated by a simulation study and a real data analysis.
Citations: 4
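The two-step procedure can be sketched as follows (the inverse-Wishart hyperparameters are illustrative, and the banding operator below simply zeroes entries outside the band; the paper's post-processing function also addresses positive definiteness, which this sketch ignores):

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(0)

# Simulated data: n observations of a p-dimensional, mean-zero vector.
p, n, bandwidth = 5, 200, 1
X = rng.standard_normal((n, p))
S = X.T @ X                                       # scatter matrix

# Step 1: sample from an unconstrained conjugate inverse-Wishart posterior
# (prior IW(nu0, I_p); nu0 = p + 2 is an illustrative choice).
nu0 = p + 2
posterior = invwishart(df=nu0 + n, scale=np.eye(p) + S)
samples = posterior.rvs(size=1000, random_state=rng)   # (1000, p, p)

# Step 2: post-process each draw with a banding operator B_k that keeps
# entries with |i - j| <= k and zeroes the rest.
mask = np.abs(np.subtract.outer(np.arange(p), np.arange(p))) <= bandwidth
banded_samples = samples * mask                   # broadcast over draws

# Point and interval estimates of a within-band entry, here (1, 2).
lo, hi = np.percentile(banded_samples[:, 0, 1], [2.5, 97.5])
print(banded_samples[:, 0, 1].mean(), (lo, hi))
```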
Gaussian Orthogonal Latent Factor Processes for Large Incomplete Matrices of Correlated Data
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-11-21, DOI: 10.1214/21-ba1295
Mengyang Gu, Hanmo Li
Abstract: We introduce the Gaussian orthogonal latent factor processes for modeling and predicting large correlated data. To handle the computational challenge, we first decompose the likelihood function of the Gaussian random field with a multi-dimensional input domain into a product of densities at the orthogonal components with lower-dimensional inputs. The continuous-time Kalman filter is implemented to efficiently compute the likelihood function without approximation. We also show that the posterior distributions of the factor processes are independent, as a consequence of the prior independence of the factor processes and the orthogonality of the factor loading matrix. For studies with a large sample size, we propose a flexible way to model the mean and derive the closed-form marginal posterior distribution. Both simulated and real data applications confirm the outstanding performance of this method.
Citations: 5
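A small simulation illustrates why orthogonal loadings decouple the factors (the exponential covariance, dimensions, and noise level are illustrative; the continuous-time Kalman filter computation from the paper is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy version of the model Y = A Z(x) + noise with orthogonal loadings A.
n_x, d, n_out = 200, 3, 10                        # inputs, factors, output dim
x = np.linspace(0, 1, n_x)

def exp_cov(x, lengthscale):
    """Exponential (Matern-1/2) covariance on a 1-D input grid."""
    return np.exp(-np.abs(np.subtract.outer(x, x)) / lengthscale)

Z = np.stack([rng.multivariate_normal(np.zeros(n_x), exp_cov(x, ls))
              for ls in (0.1, 0.3, 0.5)])         # (d, n_x) latent factors

A, _ = np.linalg.qr(rng.standard_normal((n_out, d)))    # orthonormal loadings
Y = A @ Z + 0.1 * rng.standard_normal((n_out, n_x))     # observed matrix

# Because A has orthonormal columns (A.T @ A = I), projecting the data onto
# column l isolates factor l plus independent noise, so each factor can be
# handled by its own one-dimensional filter.
Z_hat = A.T @ Y
print(np.corrcoef(Z_hat[0], Z[0])[0, 1])          # close to 1
```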
Functional Central Limit Theorems for Stick-Breaking Priors
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-11-19, DOI: 10.1214/21-ba1290
Yaozhong Hu, Junxi Zhang
Abstract: We obtain the empirical strong law of large numbers, empirical Glivenko-Cantelli theorem, central limit theorem, and functional central limit theorem for various nonparametric Bayesian priors, including the Dirichlet process with general stick-breaking weights, the Poisson-Dirichlet process, the normalized inverse Gaussian process, the normalized generalized gamma process, and the generalized Dirichlet process. For the Dirichlet process with general stick-breaking weights, we introduce two general conditions under which the central limit theorem and functional central limit theorem hold. Except in the case of the generalized Dirichlet process, the finite-dimensional distributions of these processes are either hard to obtain or complicated to use even when they are available, so we use the method of moments to obtain the convergence results. For the generalized Dirichlet process, we use its finite-dimensional marginal distributions to obtain the asymptotics, although the computations are highly technical.
Citations: 1
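For reference, the generic stick-breaking construction covered by these results can be sketched as follows (Beta(1, α) sticks give the Dirichlet process; the other priors in the paper replace the stick distribution):

```python
import numpy as np

rng = np.random.default_rng(2)

def stick_breaking_measure(alpha, n_atoms, rng):
    """Truncated stick-breaking weights w_k = v_k * prod_{j<k} (1 - v_j)."""
    v = rng.beta(1.0, alpha, size=n_atoms)        # Beta(1, alpha): Dirichlet process
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    atoms = rng.standard_normal(n_atoms)          # atoms from a N(0, 1) base measure
    return w, atoms

# The functional CLT concerns fluctuations of functionals such as the random
# mean (and of the random c.d.f.) as the concentration parameter grows.
w, atoms = stick_breaking_measure(alpha=5.0, n_atoms=2000, rng=rng)
print(w.sum(), (w * atoms).sum())                 # weights nearly sum to 1
```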
R*: A Robust MCMC Convergence Diagnostic with Uncertainty Using Decision Tree Classifiers
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-11-19, DOI: 10.1214/20-ba1252
Ben Lambert, Aki Vehtari
Abstract: Markov chain Monte Carlo (MCMC) has transformed Bayesian model inference over the past three decades: mainly because of this, Bayesian inference is now a workhorse of applied scientists. Under general conditions, MCMC sampling converges asymptotically to the posterior distribution, but this provides no guarantees about its performance in finite time. The predominant method for monitoring convergence is to run multiple chains, monitor individual chains' characteristics, and compare these to the population as a whole: if within-chain and between-chain summaries are comparable, then this is taken to indicate that the chains have converged to a common stationary distribution. Here, we introduce a new method for diagnosing convergence based on how well a machine learning classifier model can successfully discriminate the individual chains. We call this convergence measure $R^*$. In contrast to the predominant $\widehat{R}$, $R^*$ is a single statistic across all parameters that indicates lack of mixing, although individual variables' importance for this metric can also be determined. Additionally, $R^*$ is not based on any single characteristic of the sampling distribution; instead it uses all the information in the chain, including that given by the joint sampling distribution, which is currently largely overlooked by existing approaches. We recommend calculating $R^*$ using two different machine learning classifiers, gradient-boosted regression trees and random forests, which each work well in models of different dimensions. Because each of these methods outputs a classification probability, as a byproduct, we obtain uncertainty in $R^*$. The method is straightforward to implement and could be a complementary additional check on MCMC convergence for applied analyses.
Citations: 14
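A compact sketch of the idea using scikit-learn's gradient-boosted trees (the scaling below, held-out accuracy divided by the chance level 1/M so that values near 1 indicate good mixing, is one convention; this is not the authors' reference implementation):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Fake MCMC output: 4 chains, 1000 draws each, 5 parameters.
# Chain 3 is deliberately shifted to mimic a non-mixing chain.
n_chains, n_draws, n_params = 4, 1000, 5
draws = rng.standard_normal((n_chains, n_draws, n_params))
draws[3] += 0.5

X = draws.reshape(n_chains * n_draws, n_params)
y = np.repeat(np.arange(n_chains), n_draws)       # chain labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)

# If the chains are well mixed, the classifier cannot beat chance (1/n_chains)
# and the scaled statistic is close to 1; values well above 1 indicate that at
# least one chain is distinguishable from the others.
r_star = accuracy / (1.0 / n_chains)
print(round(accuracy, 3), round(r_star, 3))
```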
Joint Bayesian Analysis of Multiple Response-Types Using the Hierarchical Generalized Transformation Model
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-11-01, DOI: 10.1214/20-ba1246
J. Bradley
Abstract: Consider the situation where an analyst has a Bayesian statistical model that performs well for continuous data. However, suppose the observed dataset consists of multiple response-types (e.g., continuous, count-valued, Bernoulli trials, etc.), which are distributed from more than one class of distributions. We refer to these types of data as “multiple response-type” datasets. The goal of this article is to introduce a reasonable, easy-to-implement, all-purpose method that “converts” a Bayesian statistical model for continuous responses (call this the preferred model) into a Bayesian model for multiple response-type datasets. To do this, we consider a transformation of the multiple response-type data such that the transformed data can be reasonably modeled using the preferred model. What is unique about our strategy is that we treat the transformations as unknown and use a Bayesian approach to model this uncertainty. The implementation of our Bayesian approach to unknown transformations is straightforward and involves two steps. The first step produces posterior replicates of the transformed multiple response-type data from a latent conjugate multivariate (LCM) model. The second step involves generating values from the posterior distribution implied by the preferred model. We demonstrate the flexibility of our model through an application to Bayesian additive regression trees (BART) and a spatio-temporal mixed effects (SME) model. We provide a thorough joint multiple response-type spatio-temporal analysis of coronavirus disease 2019 (COVID-19) cases, the adjusted closing price of the Dow Jones Industrial (DJI), and Google Trends data.
Citations: 10
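A heavily simplified sketch of the two-step idea for count-valued responses (the Gamma-Poisson latent model and the Gaussian "preferred model" below are illustrative stand-ins, not the paper's latent conjugate multivariate model):

```python
import numpy as np

rng = np.random.default_rng(4)

# (1) Draw posterior replicates of a continuous transformation of the counts
#     from a conjugate latent model, then (2) refit the "preferred"
#     continuous-data model to each replicate, propagating the uncertainty
#     in the transformation. Both model choices here are illustrative.
y = rng.poisson(lam=6.0, size=50)                 # count-valued responses
a, b = 1.0, 0.1                                   # Gamma(a, b) latent prior

n_rep = 500
preferred_posterior_means = np.empty(n_rep)
for r in range(n_rep):
    # Step 1: one replicate of the transformed (continuous) data, log(lambda_i),
    # drawn from the conjugate Gamma posterior lambda_i | y_i.
    lam = rng.gamma(shape=a + y, scale=1.0 / (b + 1.0))
    z = np.log(lam)
    # Step 2: preferred model = conjugate normal-mean model applied to z
    # (flat prior on the mean, plug-in variance), one posterior draw per replicate.
    preferred_posterior_means[r] = rng.normal(z.mean(), z.std(ddof=1) / np.sqrt(z.size))

print(np.percentile(preferred_posterior_means, [2.5, 50, 97.5]))
```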
High-Dimensional Bayesian Network Classification with Network Global-Local Shrinkage Priors
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-09-23, DOI: 10.1214/23-ba1378
Sharmistha Guha, Abel Rodríguez
Abstract: This article proposes a novel Bayesian classification framework for networks with labeled nodes. While the literature on statistical modeling of network data typically involves analysis of a single network, the recent emergence of complex data in several biological applications, including brain imaging studies, presents a need to devise a network classifier for subjects. This article considers an application from a brain connectome study, where the overarching goal is to classify subjects into two separate groups based on their brain network data, along with identifying influential regions of interest (ROIs), referred to as nodes. Existing approaches either treat all edge weights as a long vector or summarize the network information with a few summary measures. Both approaches ignore the full network structure, may lead to less desirable inference in small samples, and are not designed to identify significant network nodes. We propose a novel binary logistic regression framework with the network as the predictor and a binary response, with the network predictor coefficient modeled using a novel class of global-local shrinkage priors. The framework is able to accurately detect nodes and edges in the network that influence the classification. Our framework is implemented using an efficient Markov chain Monte Carlo algorithm. Theoretically, we show asymptotically optimal classification for the proposed framework when the number of network edges grows faster than the sample size. The framework is empirically validated by extensive simulation studies and the analysis of a brain connectome dataset.
Citations: 0
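The global-local structure on the edge coefficients can be sketched with a single prior draw (a generic horseshoe-type prior and a vectorized upper-triangular edge predictor are used for illustration; the paper's prior is tailored to the network structure):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy networks: n subjects, V nodes, symmetric edge-weight matrices.
n, V = 100, 10
n_edges = V * (V - 1) // 2
iu = np.triu_indices(V, k=1)
A = rng.standard_normal((n, V, V))
A = (A + A.transpose(0, 2, 1)) / 2                # symmetrize
X = A[:, iu[0], iu[1]]                            # vectorized upper-triangular edges

# One draw from a horseshoe-type global-local shrinkage prior on the edge
# coefficients: beta_e ~ N(0, tau^2 * lambda_e^2), tau global, lambda_e local.
tau = np.abs(rng.standard_cauchy())               # global scale
lam = np.abs(rng.standard_cauchy(n_edges))        # local (per-edge) scales
beta = rng.normal(0.0, tau * lam)

# Binary class labels from the logistic regression with the network predictor.
logits = np.clip(X @ beta, -30, 30)               # clip for numerical stability
prob = 1.0 / (1.0 + np.exp(-logits))
labels = rng.binomial(1, prob)
print(labels.mean())
```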
Independent Finite Approximations for Bayesian Nonparametric Inference
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-09-22, DOI: 10.1214/23-ba1385
Tin D. Nguyen, Jonathan Huggins, L. Masoero, Lester W. Mackey, Tamara Broderick
Abstract: Bayesian nonparametric priors based on completely random measures (CRMs) offer a flexible modeling approach when the number of latent components in a dataset is unknown. However, managing the infinite dimensionality of CRMs typically requires practitioners to derive ad-hoc algorithms, preventing the use of general-purpose inference methods and often leading to long compute times. We propose a general but explicit recipe to construct a simple finite-dimensional approximation that can replace the infinite-dimensional CRMs. Our independent finite approximation (IFA) is a generalization of important cases that are used in practice. The independence of atom weights in our approximation (i) makes the construction well-suited for parallel and distributed computation and (ii) facilitates more convenient inference schemes. We quantify the approximation error between IFAs and the target nonparametric prior. We compare IFAs with an alternative approximation scheme, truncated finite approximations (TFAs), where the atom weights are constructed sequentially. We prove that, for worst-case choices of observation likelihoods, TFAs are a more efficient approximation than IFAs. However, in real-data experiments with image denoising and topic modeling, we find that IFAs perform very similarly to TFAs in terms of task-specific accuracy metrics.
Citations: 1
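As a concrete instance of the IFA/TFA contrast, the sketch below approximates a gamma CRM (and hence, after normalization, the Dirichlet process) both ways (the i.i.d. Gamma(α/K, 1) atom weights are the standard symmetric-Dirichlet construction; the paper's recipe covers many more CRMs):

```python
import numpy as np

rng = np.random.default_rng(6)
alpha, K = 2.0, 200                               # concentration, number of atoms

# Independent finite approximation (IFA): K i.i.d. unnormalized atom weights.
# Normalizing i.i.d. Gamma(alpha / K, 1) weights gives the finite symmetric
# Dirichlet approximation to the Dirichlet process.
w_ifa = rng.gamma(shape=alpha / K, scale=1.0, size=K)
w_ifa /= w_ifa.sum()

# Truncated finite approximation (TFA): sequential stick-breaking weights.
v = rng.beta(1.0, alpha, size=K)
w_tfa = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))

# Both place most mass on a few atoms for small alpha; the IFA weights are
# exchangeable and can be generated fully in parallel.
print(np.sort(w_ifa)[-3:], np.sort(w_tfa)[-3:])
```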
Bayesian Nonparametric Bivariate Survival Regression for Current Status Data
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-09-14, DOI: 10.1214/22-ba1346
Giorgio Paulon, Peter Muller, V. S. Y. Rosas
Abstract: We consider nonparametric inference for event time distributions based on current status data. We show that in this scenario conventional mixture priors, including the popular Dirichlet process mixture prior, lead to biologically uninterpretable results, as they unnaturally skew the probability mass for the event times toward the extremes of the observed data. Simple assumptions on dependent censoring can fix the problem. We then extend the discussion to bivariate current status data with partial ordering of the two outcomes. In addition to dependent censoring, we also exploit some minimal known structure relating the two event times. We design a Markov chain Monte Carlo algorithm for posterior simulation. Applied to a recurrent infection study, the method provides novel insights into how symptom-related hospital visits are affected by covariates.
Citations: 1
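The structure of (univariate) current status data can be seen in a short simulation: only a monitoring time and an indicator of whether the event has already occurred are recorded (the Weibull event times and uniform monitoring times are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(7)

# Current status data: the event time T is never observed directly; we only
# see the monitoring time C and the indicator of whether T <= C.
n = 500
T = rng.weibull(a=1.5, size=n) * 2.0              # latent event times (unobserved)
C = rng.uniform(0.0, 4.0, size=n)                 # monitoring (censoring) times
delta = (T <= C).astype(int)                      # current status indicator

# The observed data are (C, delta); any nonparametric prior on the event-time
# distribution must be combined with assumptions on the censoring mechanism,
# which is where conventional mixture priors run into trouble.
print(delta.mean())                               # fraction with the event already observed
```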
Bayesian Modelling of Time-Varying Conditional Heteroscedasticity
IF 4.4, CAS Region 2, Mathematics
Bayesian Analysis, Pub Date: 2020-09-13, DOI: 10.1214/21-BA1267
Sayar Karmakar, Arkaprava Roy
Abstract: Conditional heteroscedastic (CH) models are routinely used to analyze financial datasets. The classical models, such as ARCH-GARCH with time-invariant coefficients, are often inadequate to describe frequent changes over time due to market variability. However, we can achieve significantly better insight by considering the time-varying analogues of these models. In this paper, we propose a Bayesian approach to the estimation of such models and develop a computationally efficient MCMC algorithm based on Hamiltonian Monte Carlo (HMC) sampling. We also establish posterior contraction rates with increasing sample size in terms of the average Hellinger metric. The performance of our method is compared with frequentist estimates and with estimates from the time-constant analogues. To conclude the paper, we obtain time-varying parameter estimates for some popular Forex (currency conversion rate) and stock market datasets.
Citations: 10
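Data from a time-varying conditional heteroscedastic model can be generated in a few lines (a tvARCH(1) with smoothly varying coefficient curves; the specific curves are illustrative, and no fitting is performed):

```python
import numpy as np

rng = np.random.default_rng(8)

# tvARCH(1): x_t = sigma_t * eps_t,  sigma_t^2 = omega(t/n) + a(t/n) * x_{t-1}^2,
# with coefficients that vary smoothly over rescaled time u = t / n.
n = 1000
u = np.arange(1, n + 1) / n
omega = 0.2 + 0.1 * np.sin(2 * np.pi * u)         # illustrative smooth curves
a = 0.3 + 0.2 * u

x = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = omega[0]
x[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, n):
    sigma2[t] = omega[t] + a[t] * x[t - 1] ** 2
    x[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# A Bayesian fit would place priors (e.g., basis expansions) on omega(.) and
# a(.) and sample them with HMC; here we only generate data from the model.
print(x[:5].round(3))
```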