{"title":"Robust and computationally efficient gradient-based estimation","authors":"Yibo Yan , Xiaozhou Wang , Riquan Zhang","doi":"10.1016/j.jspi.2025.106351","DOIUrl":null,"url":null,"abstract":"<div><div>In this paper, we propose a class of estimators based on robust and computationally efficient gradient estimation for both low- and high-dimensional risk minimization frameworks. The gradient estimates in this work are constructed from a series of newly proposed univariate robust and efficient mean estimators. Our proposed estimators are obtained iteratively via a variant of gradient descent in which the update direction is determined by a robust and computationally efficient gradient. These estimators not only admit explicit expressions computable through elementary arithmetic operations but are also robust to arbitrary outliers in common statistical models. Theoretically, we establish the convergence of the algorithms and derive non-asymptotic error bounds for the resulting iterative estimators. Specifically, we apply our methods to linear and logistic regression models, obtaining robust parameter estimates and corresponding excess risk bounds. Unlike previous work, our theoretical results depend on a magnitude function of the outliers, which captures the extent of their deviation from the inliers. Finally, we present extensive simulation experiments on both low- and high-dimensional linear models to demonstrate the superior performance of our proposed estimators compared to several baseline methods.</div></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"242 ","pages":"Article 106351"},"PeriodicalIF":0.8000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375825000898","RegionNum":4,"RegionCategory":"Mathematics","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, we propose a class of estimators based on robust and computationally efficient gradient estimation for both low- and high-dimensional risk minimization frameworks. The gradient estimates in this work are constructed from a series of newly proposed univariate robust and efficient mean estimators. Our proposed estimators are obtained iteratively via a variant of gradient descent in which the update direction is determined by a robust and computationally efficient gradient. These estimators not only admit explicit expressions computable through elementary arithmetic operations but are also robust to arbitrary outliers in common statistical models. Theoretically, we establish the convergence of the algorithms and derive non-asymptotic error bounds for the resulting iterative estimators. Specifically, we apply our methods to linear and logistic regression models, obtaining robust parameter estimates and corresponding excess risk bounds. Unlike previous work, our theoretical results depend on a magnitude function of the outliers, which captures the extent of their deviation from the inliers. Finally, we present extensive simulation experiments on both low- and high-dimensional linear models to demonstrate the superior performance of our proposed estimators compared to several baseline methods.
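The core recipe the abstract describes — gradient descent in which each coordinate of the gradient is estimated by a robust univariate mean of the per-sample gradients, rather than by the outlier-sensitive sample mean — can be sketched as follows. The paper's specific mean estimators are not reproduced here; this sketch substitutes coordinate-wise median-of-means, a standard robust univariate mean estimator, and applies it to linear regression with squared loss. All function names and parameters are illustrative.

```python
import numpy as np

def median_of_means(x, k):
    """Robust univariate mean: split the values into k blocks,
    average within each block, and return the median of the block means."""
    blocks = np.array_split(x, k)
    return np.median([b.mean() for b in blocks])

def robust_gd_linear(X, y, lr=0.1, n_iter=300, k=25):
    """Gradient descent for squared-loss linear regression in which each
    gradient coordinate is a robust mean of the per-sample gradients,
    so a few grossly corrupted responses cannot hijack the update."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        resid = X @ beta - y            # per-sample residuals, shape (n,)
        grads = resid[:, None] * X      # per-sample gradients, shape (n, d)
        # coordinate-wise robust mean estimate of the gradient
        g = np.array([median_of_means(grads[:, j], k) for j in range(d)])
        beta -= lr * g
    return beta
```

On data where a small fraction of responses is grossly corrupted, this update stays near the true coefficients, whereas the plain sample-mean gradient is dragged toward the outliers; the block count k trades statistical efficiency (small k) against robustness (large k).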
About the journal:
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary aspects that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large-sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.