{"title":"A New Statistic for Bayesian Hypothesis Testing","authors":"Su Chen , Stephen G. Walker","doi":"10.1016/j.ecosta.2021.10.009","DOIUrl":null,"url":null,"abstract":"<div><p>A new Bayesian–inspired statistic<span><span> for hypothesis testing<span> is proposed which compares two posterior distributions; the observed posterior and the expected posterior under the </span></span>null<span> model. The Kullback–Leibler divergence between the two posterior distributions yields a test statistic which can be interpreted as a penalized log–Bayes factor with the penalty term converging to a constant as the sample size increases. Hence, asymptotically, the statistic behaves as a Bayes factor<span>. Viewed as a penalized Bayes factor, this approach solves the long standing issue of using improper priors with the Bayes factor, since only posterior summaries are needed for the new statistic. Further motivation for the new statistic is a minimal move from the Bayes factor which requires no tuning nor splitting of data into training and inference, and can use improper priors. Critical regions for the test can be assessed using frequentist notions of Type I error.</span></span></span></p></div>","PeriodicalId":54125,"journal":{"name":"Econometrics and Statistics","volume":"26 ","pages":"Pages 139-152"},"PeriodicalIF":2.0000,"publicationDate":"2023-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Econometrics and Statistics","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2452306221001246","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ECONOMICS","Score":null,"Total":0}
Citations: 1
Abstract
A new Bayesian-inspired statistic for hypothesis testing is proposed that compares two posterior distributions: the observed posterior and the expected posterior under the null model. The Kullback–Leibler divergence between the two posterior distributions yields a test statistic which can be interpreted as a penalized log-Bayes factor, with the penalty term converging to a constant as the sample size increases. Hence, asymptotically, the statistic behaves as a Bayes factor. Viewed as a penalized Bayes factor, this approach resolves the long-standing issue of using improper priors with the Bayes factor, since only posterior summaries are needed for the new statistic. A further motivation is that the new statistic is a minimal move from the Bayes factor: it requires neither tuning nor splitting of the data into training and inference sets, and it can use improper priors. Critical regions for the test can be assessed using frequentist notions of Type I error.
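To make the construction concrete, the sketch below computes such a statistic in a simple stand-in setting: a normal mean with known variance and a flat (improper) prior, where both posteriors are Gaussian and the Kullback–Leibler divergence has a closed form. The choice of N(theta0, sigma^2/n) as the "expected posterior under the null" is an assumption made purely for illustration and may differ from the paper's exact construction; the Monte Carlo calibration only illustrates how a critical region can be set by controlling the frequentist Type I error.

```python
import numpy as np

# Closed-form KL divergence KL( N(mu1, s1^2) || N(mu0, s0^2) ).
def kl_normal(mu1, s1, mu0, s0):
    return np.log(s0 / s1) + (s1**2 + (mu1 - mu0)**2) / (2 * s0**2) - 0.5

rng = np.random.default_rng(0)
n, sigma, theta0 = 50, 1.0, 0.0   # sample size, known sd, null value (assumed setup)

def statistic(x):
    """KL divergence between the observed posterior and a stand-in for the
    expected posterior under the null.  With a flat prior and known sigma, the
    observed posterior for the mean is N(xbar, sigma^2/n); the expected-null
    posterior is taken here to be N(theta0, sigma^2/n) -- an illustrative
    assumption, not necessarily the paper's definition."""
    xbar = x.mean()
    post_sd = sigma / np.sqrt(len(x))
    return kl_normal(xbar, post_sd, theta0, post_sd)

# Calibrate a critical region by the Type I error under the null model.
null_stats = np.array([statistic(rng.normal(theta0, sigma, n)) for _ in range(20000)])
critical_value = np.quantile(null_stats, 0.95)   # 5% Type I error

# Apply the test to one data set drawn from an alternative mean of 0.5.
x_obs = rng.normal(0.5, sigma, n)
print(statistic(x_obs), critical_value, statistic(x_obs) > critical_value)
```

In this Gaussian stand-in the statistic reduces to half the squared z-statistic, (xbar - theta0)^2 / (2 sigma^2 / n), so the simulation simply recovers the usual normal-theory critical region; the value of the construction lies in settings where the posteriors are not available in such simple form.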
Journal description:
Econometrics and Statistics is the official journal of the networks Computational and Financial Econometrics and Computational and Methodological Statistics. It publishes research papers on all aspects of econometrics and statistics and comprises two sections: Part A: Econometrics and Part B: Statistics.