Uniform Estimation and Inference for Nonparametric Partitioning-Based M-Estimators
Matias D. Cattaneo, Yingjie Feng, Boris Shigida
arXiv - ECON - Econometrics, published 2024-09-09. DOI: arxiv-2409.05715 (https://doi.org/arxiv-2409.05715)
Citations: 0
Abstract
This paper presents uniform estimation and inference theory for a large class
of nonparametric partitioning-based M-estimators. The main theoretical results
include: (i) uniform consistency for convex and non-convex objective functions;
(ii) optimal uniform Bahadur representations; (iii) optimal uniform (and mean
square) convergence rates; (iv) valid strong approximations and feasible
uniform inference methods; and (v) extensions to functional transformations of
underlying estimators. Uniformity is established over both the evaluation point
of the nonparametric functional parameter and a Euclidean parameter indexing
the class of loss functions. The results also account explicitly for the
smoothness degree of the loss function (if any), and allow for a possibly
non-identity (inverse) link function. We illustrate the main theoretical and
methodological results with four substantive applications: quantile regression,
distribution regression, $L_p$ regression, and logistic regression; many other
possibly non-smooth, nonlinear, generalized, robust M-estimation settings are
covered by our theoretical results. We provide detailed comparisons with the
existing literature and demonstrate substantive improvements: we achieve the
best (in some cases optimal) known results under improved (in some cases
minimal) requirements in terms of regularity conditions and side rate
restrictions. The supplemental appendix reports other technical results that
may be of independent interest.
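To make the partitioning-based idea concrete, here is a minimal sketch of its simplest instance for the quantile regression application: partition the covariate space into cells and fit a local (here degree-zero, i.e. piecewise-constant) quantile estimate within each cell. This is an illustrative toy, not the paper's estimator; the function names (`partition_quantile_fit`, `partition_quantile_predict`) and the equal-width partition of $[0,1]$ are assumptions made for the example, and the paper's theory covers far more general bases, loss functions, and link functions.

```python
import numpy as np

def partition_quantile_fit(x, y, tau, n_bins):
    """Piecewise-constant partitioning-based quantile regression on [0, 1]:
    split the support into n_bins equal-width cells and take the tau-th
    empirical quantile of the responses falling in each cell."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # np.digitize maps each x to a cell index in 0..n_bins-1
    cells = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    fitted = np.full(n_bins, np.nan)
    for j in range(n_bins):
        in_cell = y[cells == j]
        if in_cell.size > 0:
            fitted[j] = np.quantile(in_cell, tau)
    return edges, fitted

def partition_quantile_predict(edges, fitted, x_new):
    """Evaluate the fitted piecewise-constant function at new points."""
    j = np.clip(np.digitize(x_new, edges) - 1, 0, fitted.size - 1)
    return fitted[j]
```

In this toy version, richer local bases (splines, piecewise polynomials) replace the within-cell constant, and the number of cells grows with the sample size; the paper's results quantify, uniformly over the evaluation point and the quantile level $\tau$, how fast such estimators converge and how to conduct valid inference on them.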