{"title":"Estimation of Split Points in Misspecified Decision Trees","authors":"J. Escanciano","doi":"10.2139/ssrn.3591411","DOIUrl":null,"url":null,"abstract":"We establish rates of convergence for the least squares estimator of the split point in misspecified decision trees. We close the gap between the known superconsistency rate of the correctly specified case and the slow cube-root convergence of the misspecified smooth regression case. When the true regression function is discontinuous at the split point but not constant on both sides, so the simple binary tree model is misspecified, we recover the superconsistency of the least squares split point estimate and the asymptotic normality at parametric rates of the least squares level coefficients. When the regression function is continuous with a kink at the split point, we obtain rates between superconsistency and cube-root asymptotics, depending on the smoothness of the regression function around the split point. The analysis is extended to threshold regressions, where analogous rate results are obtained. In particular, we show that inference on the slope coefficients is robust to misspecification when a certain regression function is discontinuous at the split point. Monte Carlo simulations confirm the theoretical results.","PeriodicalId":11495,"journal":{"name":"Econometric Modeling: Capital Markets - Forecasting eJournal","volume":"45 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Econometric Modeling: Capital Markets - Forecasting eJournal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.3591411","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
We establish rates of convergence for the least squares estimator of the split point in misspecified decision trees. We close the gap between the known superconsistency rate of the correctly specified case and the slow cube-root convergence of the misspecified smooth regression case. When the true regression function is discontinuous at the split point but not constant on both sides, so that the simple binary tree model is misspecified, we recover superconsistency of the least squares split point estimate and parametric-rate asymptotic normality of the least squares level coefficients. When the regression function is continuous with a kink at the split point, we obtain rates between superconsistency and cube-root asymptotics, depending on the smoothness of the regression function around the split point. The analysis extends to threshold regressions, where analogous rate results hold. In particular, we show that inference on the slope coefficients is robust to misspecification when a certain regression function is discontinuous at the split point. Monte Carlo simulations confirm the theoretical results.
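
To make the setting concrete, the sketch below (mine, not the paper's code) fits a one-split regression tree, a "stump", by least squares via an exhaustive search over candidate split points, and runs a small Monte Carlo in a design where the true regression function jumps at the split point but is not constant on either side, i.e. the misspecified discontinuous case described in the abstract. The data generating process, noise level, and function names (fit_stump, simulate) are illustrative assumptions; under superconsistency the split point error should shrink at roughly rate 1/n rather than the cube-root rate n^{-1/3}.

```python
# Minimal, self-contained sketch (not the paper's code) of the least squares
# split point estimator for a one-split ("stump") regression tree, plus a small
# Monte Carlo check in a misspecified discontinuous design. The DGP, noise
# level, and function names are illustrative assumptions.
import numpy as np


def fit_stump(x, y):
    """Least squares fit of y ~ a*1{x <= s} + b*1{x > s}.

    Searches all splits between consecutive order statistics of x and returns
    the split point and the two level coefficients minimizing the residual
    sum of squares.
    """
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    n = len(ys)
    cs, cs2 = np.cumsum(ys), np.cumsum(ys ** 2)
    i = np.arange(1, n)                      # observations in the left node
    left_sum, left_sq = cs[:-1], cs2[:-1]
    right_sum, right_sq = cs[-1] - left_sum, cs2[-1] - left_sq
    # Within-node sums of squared deviations from the node means.
    sse = (left_sq - left_sum ** 2 / i) + (right_sq - right_sum ** 2 / (n - i))
    j = int(np.argmin(sse))
    return xs[j], left_sum[j] / i[j], right_sum[j] / (n - i[j])


def simulate(n, rng, split=0.5, jump=1.0):
    """m(x) = x + jump*1{x > split}: discontinuous at the split point and not
    constant on either side, so the stump model is misspecified."""
    x = rng.uniform(0.0, 1.0, n)
    y = x + jump * (x > split) + 0.2 * rng.standard_normal(n)
    return x, y


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for n in (200, 800, 3200):
        errs = [abs(fit_stump(*simulate(n, rng))[0] - 0.5) for _ in range(200)]
        # Under superconsistency the error should shrink roughly like 1/n,
        # much faster than the cube-root rate n**(-1/3).
        print(f"n={n:5d}  mean |s_hat - s0| = {np.mean(errs):.4f}")
```

Watching the mean absolute split point error fall much faster than n^{-1/3} as n quadruples is only an informal check of the rate results; it is not a substitute for the paper's formal analysis.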