Feasible Generalized Least Squares Using Machine Learning
Authors: Steve Miller, R. Startz
DOI: 10.2139/ssrn.2966194 (https://doi.org/10.2139/ssrn.2966194)
Journal: ERN: Hypothesis Testing (Topic)
Published: 2018-02-06
Citations: 27
Abstract
In the presence of heteroskedastic errors, regression using Feasible Generalized Least Squares (FGLS) offers potential efficiency gains over Ordinary Least Squares (OLS). However, FGLS adoption remains limited, in part because the form of heteroskedasticity may be misspecified. We investigate machine learning methods to address this concern, focusing on Support Vector Regression. Monte Carlo results indicate that the resulting estimator, together with an accompanying standard error correction, offers substantially improved precision, coverage rates closer to nominal levels, and shorter confidence intervals relative to OLS with heteroskedasticity-consistent (HC3) standard errors. Reductions in root mean squared error are over 90% of those achievable when the form of heteroskedasticity is known.
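The general FGLS recipe the abstract builds on can be sketched as follows: fit OLS, model the (log) squared residuals as a function of the regressors with a flexible learner, and reweight by the inverse of the predicted variances. The sketch below is an illustration of that recipe using scikit-learn's SVR, not the authors' exact procedure; the simulated data, kernel settings, and variance function are assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

# Simulated data with heteroskedastic errors (illustrative assumptions):
# the error standard deviation grows with the regressor.
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 2.0, size=(n, 1))
sigma = 0.5 + x[:, 0] ** 2
y = 1.0 + 2.0 * x[:, 0] + sigma * rng.standard_normal(n)

# Step 1: OLS to obtain residuals.
ols = LinearRegression().fit(x, y)
resid = y - ols.predict(x)

# Step 2: estimate the skedastic function by regressing log squared
# residuals on x with SVR (kernel/hyperparameters are illustrative).
svr = SVR(kernel="rbf", C=1.0, epsilon=0.1)
svr.fit(x, np.log(resid ** 2 + 1e-12))
var_hat = np.exp(svr.predict(x))

# Step 3: weighted least squares with weights 1 / var_hat,
# implemented by rescaling rows and solving ordinary least squares.
X = np.column_stack([np.ones(n), x[:, 0]])
w = 1.0 / np.sqrt(var_hat)
beta_fgls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
print(beta_fgls)  # FGLS estimates of intercept and slope
```

The paper's Monte Carlo comparison additionally pairs this estimator with a standard error correction; producing valid inference requires that extra step, which is not shown here.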