{"title":"因果推理的双鲁棒准则","authors":"Takamichi Baba, Yoshiyuki Ninomiya","doi":"10.1002/cjs.70001","DOIUrl":null,"url":null,"abstract":"<p>In causal inference, semiparametric estimation using propensity scores has rapidly developed in various directions. At the same time, although model selection is indispensable in statistical analysis, an information criterion for selecting the regression structure between the potential outcome and explanatory variables has not been well developed. Here, based on the original definition of AIC, we derive an AIC-type criterion for propensity score analysis. A risk based on the Kullback–Leibler divergence is defined as the cornerstone, and general causal inference models and general causal effects are treated. Considering the high importance of doubly robust estimation, we make the information criterion itself doubly robust so that it is an asymptotically unbiased estimator of the risk even under some model misspecification. In simulation studies, we compare the derived criterion with an existing weighted quasi-likelihood information criterion and confirm that the former outperforms the latter. Real data analyses indicate that results using the two criteria can differ significantly.</p>","PeriodicalId":55281,"journal":{"name":"Canadian Journal of Statistics-Revue Canadienne De Statistique","volume":"53 3","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2025-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/cjs.70001","citationCount":"0","resultStr":"{\"title\":\"Doubly robust criterion for causal inference\",\"authors\":\"Takamichi Baba, Yoshiyuki Ninomiya\",\"doi\":\"10.1002/cjs.70001\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In causal inference, semiparametric estimation using propensity scores has rapidly developed in various directions. At the same time, although model selection is indispensable in statistical analysis, an information criterion for selecting the regression structure between the potential outcome and explanatory variables has not been well developed. Here, based on the original definition of AIC, we derive an AIC-type criterion for propensity score analysis. A risk based on the Kullback–Leibler divergence is defined as the cornerstone, and general causal inference models and general causal effects are treated. Considering the high importance of doubly robust estimation, we make the information criterion itself doubly robust so that it is an asymptotically unbiased estimator of the risk even under some model misspecification. In simulation studies, we compare the derived criterion with an existing weighted quasi-likelihood information criterion and confirm that the former outperforms the latter. 
Real data analyses indicate that results using the two criteria can differ significantly.</p>\",\"PeriodicalId\":55281,\"journal\":{\"name\":\"Canadian Journal of Statistics-Revue Canadienne De Statistique\",\"volume\":\"53 3\",\"pages\":\"\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2025-03-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1002/cjs.70001\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Canadian Journal of Statistics-Revue Canadienne De Statistique\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/cjs.70001\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Canadian Journal of Statistics-Revue Canadienne De Statistique","FirstCategoryId":"100","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/cjs.70001","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Abstract:
In causal inference, semiparametric estimation using propensity scores has rapidly developed in various directions. At the same time, although model selection is indispensable in statistical analysis, an information criterion for selecting the regression structure between the potential outcome and explanatory variables has not been well developed. Here, based on the original definition of AIC, we derive an AIC-type criterion for propensity score analysis. A risk based on the Kullback–Leibler divergence is defined as the cornerstone, and general causal inference models and general causal effects are treated. Considering the high importance of doubly robust estimation, we make the information criterion itself doubly robust so that it is an asymptotically unbiased estimator of the risk even under some model misspecification. In simulation studies, we compare the derived criterion with an existing weighted quasi-likelihood information criterion and confirm that the former outperforms the latter. Real data analyses indicate that results using the two criteria can differ significantly.
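For readers unfamiliar with the "doubly robust" construction the abstract builds on, the sketch below shows a standard augmented inverse probability weighting (AIPW) estimator of the average treatment effect. It is not the authors' proposed criterion; the model choices (logistic regression for the propensity score, linear regression for the outcome) and all variable names are illustrative assumptions.

```python
# Minimal AIPW sketch of a doubly robust ATE estimate (illustrative only,
# not the paper's AIC-type criterion). Assumes binary treatment t,
# scalar outcome y, and covariate matrix X.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def aipw_ate(X, t, y):
    """Doubly robust (AIPW) estimate of E[Y(1) - Y(0)]."""
    # Propensity score model: P(T = 1 | X)
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    # Outcome regressions fitted separately on treated and control units
    mu1 = LinearRegression().fit(X[t == 1], y[t == 1]).predict(X)
    mu0 = LinearRegression().fit(X[t == 0], y[t == 0]).predict(X)

    # AIPW terms: consistent if either the propensity model or the
    # outcome model is correctly specified (hence "doubly robust")
    psi1 = mu1 + t * (y - mu1) / ps
    psi0 = mu0 + (1 - t) * (y - mu0) / (1 - ps)
    return np.mean(psi1 - psi0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 2000
    X = rng.normal(size=(n, 3))
    t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))       # confounded treatment
    y = 2.0 * t + X @ np.array([1.0, -0.5, 0.3]) + rng.normal(size=n)
    print(f"AIPW ATE estimate: {aipw_ate(X, t, y):.3f}")   # true effect is 2.0
```

The paper's contribution is a model-selection criterion whose risk estimate retains this double robustness, so that it remains asymptotically unbiased even when one of the two working models is misspecified.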
Journal introduction:
The Canadian Journal of Statistics is the official journal of the Statistical Society of Canada. It has an international reputation as an excellent journal. The editorial board is composed of statistical scientists with applied, computational, methodological, theoretical and probabilistic interests. Their role is to ensure that the journal continues to provide an international forum for the discipline of Statistics.
The journal seeks papers making broad points of interest to many readers, whereas papers making important points of more specific interest are better placed in more specialized journals. The levels of innovation and impact are key in the evaluation of submitted manuscripts.