Open Journal of Statistics | Pub Date: 2011-10-01 | DOI: 10.4236/ojs.2011.13021
Chen-Pin Wang, Malay Ghosh
A Kullback-Leibler Divergence for Bayesian Model Diagnostics.

Abstract: This paper considers a Kullback-Leibler distance (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (in comparison to a competing fitted model) is correctly specified and certain regularity conditions hold (cf. Akaike [2]). We derive the asymptotic properties of this Goutis-Robert-Akaike KLD under those regularity conditions and examine how the asymptotics change when the conditions are only partially satisfied. We also establish a connection between the Goutis-Robert-Akaike KLD and a weighted posterior predictive p-value (WPPP). Finally, both the Goutis-Robert-Akaike KLD and the WPPP are applied to model comparison in various simulated examples and in two cohort studies of diabetes.

Volume 1, Issue 3, Pages 172-184. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4235748/pdf/nihms570131.pdf
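The paper's diagnostic is built on the Kullback-Leibler divergence KL(p || q) = E_p[log p(X) - log q(X)] between a reference density p and a competing fitted density q. The abstract does not give the authors' estimator, so the sketch below is only a generic Monte Carlo illustration of the underlying quantity, not the Goutis-Robert-Akaike KLD itself; the function names (`kl_mc`, `log_normal_pdf`) and the two Gaussian models are hypothetical choices for the example.

```python
import math
import random

def kl_mc(logp, logq, sampler, n=200_000, seed=0):
    """Monte Carlo estimate of KL(p || q) = E_p[log p(X) - log q(X)].

    logp, logq: log-density functions; sampler: draws X ~ p.
    (Illustrative only; not the estimator used in the paper.)
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sampler(rng)
        total += logp(x) - logq(x)
    return total / n

def log_normal_pdf(x, mu, sigma):
    # Log density of N(mu, sigma^2).
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# Hypothetical example: reference model p = N(0, 1), competing model q = N(1, 1).
kl_hat = kl_mc(
    logp=lambda x: log_normal_pdf(x, 0.0, 1.0),
    logq=lambda x: log_normal_pdf(x, 1.0, 1.0),
    sampler=lambda rng: rng.gauss(0.0, 1.0),
)

# Closed form for two unit-variance Gaussians: KL = (mu_p - mu_q)^2 / 2 = 0.5.
print(f"KL estimate: {kl_hat:.3f}")
```

With 200,000 draws the Monte Carlo estimate lands very close to the closed-form value 0.5, which is a convenient sanity check when the reference density is known in closed form, as in the correctly specified case the paper analyzes.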