Graphical posterior predictive classification: Bayesian model averaging with particle Gibbs
Tatjana Pavlenko, Felix Rios
Theory of Probability and Mathematical Statistics 98 (2023); DOI: 10.1090/tpms/1198; published 2023-10-03; JCR Q4, Statistics & Probability
Citations: 0

Abstract
In this study, we present a multi-class graphical Bayesian predictive classifier that incorporates model-selection uncertainty into the standard Bayesian formalism. For each class, the dependence structure underlying the observed features is represented by a set of decomposable Gaussian graphical models. Emphasis is then placed on Bayesian model averaging, which takes full account of the class-specific model uncertainty by averaging over the posterior graph probabilities. An explicit evaluation of these model probabilities is well known to be infeasible. To address this issue, we consider the particle Gibbs strategy of J. Olsson, T. Pavlenko, and F. L. Rios [Electron. J. Statist. 13 (2019), no. 2, 2865–2897] for posterior sampling from decomposable graphical models, which uses the so-called Christmas tree algorithm of J. Olsson, T. Pavlenko, and F. L. Rios [Stat. Comput. 32 (2022), no. 5, Paper No. 80, 18 pp.] as its proposal kernel. We also derive a strong hyper Markov law, which we call the hyper normal Wishart law, that allows the resulting Bayesian calculations to be performed locally. The proposed predictive graphical classifier shows superior performance compared to the ordinary Bayesian predictive rule that does not account for model uncertainty, as well as to a number of out-of-the-box classifiers.
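The averaging step described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: it assumes each class comes with a set of graph samples carrying approximate posterior weights (e.g. particle Gibbs sample frequencies), and it uses a plain multivariate Gaussian density as a stand-in for the per-graph predictive distribution that the hyper normal Wishart law would actually supply. All names and parameters here are hypothetical.

```python
import numpy as np

def bma_class_score(x, graphs, class_prior):
    """Posterior predictive score for one class under Bayesian model
    averaging: a weighted sum of per-graph predictive densities.

    `graphs` is a list of (weight, mean, cov) triples, where `weight`
    approximates the posterior probability of a graph and (mean, cov)
    stand in for that graph's predictive distribution.
    """
    score = 0.0
    for w, mu, cov in graphs:
        d = len(mu)
        diff = x - mu
        # Gaussian density as an illustrative stand-in for the
        # per-graph posterior predictive density
        dens = np.exp(-0.5 * diff @ np.linalg.solve(cov, diff)) / \
            np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
        score += w * dens
    return class_prior * score

def classify(x, class_models, class_priors):
    """Assign x to the class with the largest model-averaged score."""
    scores = [bma_class_score(x, g, p)
              for g, p in zip(class_models, class_priors)]
    return int(np.argmax(scores))
```

The contrast with the "ordinary" predictive rule the abstract mentions is that the latter would keep only the single highest-weight graph per class instead of summing over the whole weighted set.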