A Probabilistic View on Predictive Constructions for Bayesian Learning
P. Berti, E. Dreassi, F. Leisen, P. Rigo, L. Pratelli
Statistical Science. DOI: 10.1214/23-sts884. Published 2022-08-14.
Citations: 0
Abstract
Given a sequence $X=(X_1,X_2,\ldots)$ of random observations, a Bayesian forecaster aims to predict $X_{n+1}$ based on $(X_1,\ldots,X_n)$ for each $n\ge 0$. To this end, in principle, she only needs to select a collection $\sigma=(\sigma_0,\sigma_1,\ldots)$, called a ``strategy'' in what follows, where $\sigma_0(\cdot)=P(X_1\in\cdot)$ is the marginal distribution of $X_1$ and $\sigma_n(\cdot)=P(X_{n+1}\in\cdot\mid X_1,\ldots,X_n)$ is the $n$-th predictive distribution. Because of the Ionescu-Tulcea theorem, $\sigma$ can be assigned directly, without passing through the usual prior/posterior scheme. One main advantage is that no prior probability needs to be selected. In a nutshell, this is the predictive approach to Bayesian learning. A concise review of the latter is provided in this paper. We try to put such an approach in the right framework, to clear up a few misunderstandings, and to provide a unifying view. Some recent results are discussed as well. In addition, some new strategies are introduced and the corresponding distribution of the data sequence $X$ is determined. The strategies concern generalized P\'olya urns, random change points, covariates and stationary sequences.
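To make the predictive approach concrete, here is a minimal sketch of the classical (non-generalized) Pólya-urn strategy, in which the $n$-th predictive distribution is $\sigma_n(\{j\}) = (\alpha\,\nu(\{j\}) + \#\{i\le n : X_i=j\})/(\alpha+n)$ for a base measure $\nu$ and concentration $\alpha>0$. This is an illustrative example of a strategy assigned directly, not the paper's generalized constructions; the function and parameter names (`alpha`, `base`) are our own.

```python
import random
from collections import Counter

def polya_predictive(counts, n, alpha, base):
    """n-th predictive distribution sigma_n of a classical Polya sequence:
    P(X_{n+1} = j | X_1, ..., X_n) = (alpha * base[j] + counts[j]) / (alpha + n),
    where `base` is a probability mass function on a finite set."""
    return {j: (alpha * base[j] + counts.get(j, 0)) / (alpha + n) for j in base}

def sample_polya_sequence(length, alpha, base, rng):
    """Generate X_1, ..., X_length by sampling each X_{n+1} from sigma_n.
    By Ionescu-Tulcea, these predictives determine the law of the sequence."""
    counts, seq = Counter(), []
    for n in range(length):
        pred = polya_predictive(counts, n, alpha, base)
        x = rng.choices(list(pred), weights=list(pred.values()))[0]
        seq.append(x)
        counts[x] += 1
    return seq

# Example: after observing 'a' twice (n = 2), with alpha = 1 and a uniform base
# on {'a', 'b'}, sigma_2({'a'}) = (0.5 + 2) / 3.
pred = polya_predictive({'a': 2}, 2, 1.0, {'a': 0.5, 'b': 0.5})
```

Note that no prior is specified anywhere: the strategy $\sigma$ is the modeling input, and the exchangeable law of $X$ (here, a Pólya sequence directed by a Dirichlet process) is implied rather than assumed.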
About the journal:
The central purpose of Statistical Science is to convey the richness, breadth and unity of the field by presenting the full range of contemporary statistical thought at a moderate technical level, accessible to the wide community of practitioners, researchers and students of statistics and probability.