Antoine Godichon-Baggioni , Stéphane Robin , Laure Sansonnet
{"title":"在线和离线鲁棒多元线性回归","authors":"Antoine Godichon-Baggioni , Stéphane Robin , Laure Sansonnet","doi":"10.1016/j.csda.2026.108341","DOIUrl":null,"url":null,"abstract":"<div><div>The robust estimation of the parameters of multivariate Gaussian linear regression models is considered by using robust versions of the usual (Mahalanobis) least-square criterion, with or without Ridge regularization. Two methods of estimation are introduced: (i) online stochastic gradient descent algorithms and their averaged variants, and (ii) offline fixed-point algorithms. These methods are applied to both the standard and Mahalanobis least-squares criteria, as well as to their regularized counterparts. Under weak assumptions, the resulting estimators are shown to be asymptotically normal. Since the noise covariance matrix is generally unknown, a robust estimate of this matrix is incorporated into the Mahalanobis-based stochastic gradient descent algorithms. Numerical experiments on synthetic data demonstrate a substantial gain in robustness compared with classical least-squares estimators, while also highlighting the computational efficiency of the online procedures. 
All proposed algorithms are implemented in the <span>R</span> package <span>RobRegression</span>, available on CRAN.</div></div>","PeriodicalId":55225,"journal":{"name":"Computational Statistics & Data Analysis","volume":"218 ","pages":"Article 108341"},"PeriodicalIF":1.6000,"publicationDate":"2026-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Online and offline robust multivariate linear regression\",\"authors\":\"Antoine Godichon-Baggioni , Stéphane Robin , Laure Sansonnet\",\"doi\":\"10.1016/j.csda.2026.108341\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The robust estimation of the parameters of multivariate Gaussian linear regression models is considered by using robust versions of the usual (Mahalanobis) least-square criterion, with or without Ridge regularization. Two methods of estimation are introduced: (i) online stochastic gradient descent algorithms and their averaged variants, and (ii) offline fixed-point algorithms. These methods are applied to both the standard and Mahalanobis least-squares criteria, as well as to their regularized counterparts. Under weak assumptions, the resulting estimators are shown to be asymptotically normal. Since the noise covariance matrix is generally unknown, a robust estimate of this matrix is incorporated into the Mahalanobis-based stochastic gradient descent algorithms. Numerical experiments on synthetic data demonstrate a substantial gain in robustness compared with classical least-squares estimators, while also highlighting the computational efficiency of the online procedures. 
All proposed algorithms are implemented in the <span>R</span> package <span>RobRegression</span>, available on CRAN.</div></div>\",\"PeriodicalId\":55225,\"journal\":{\"name\":\"Computational Statistics & Data Analysis\",\"volume\":\"218 \",\"pages\":\"Article 108341\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2026-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Statistics & Data Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0167947326000034\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2026/1/17 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Statistics & Data Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167947326000034","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2026/1/17 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Online and offline robust multivariate linear regression
Robust estimation of the parameters of multivariate Gaussian linear regression models is considered using robust versions of the usual (Mahalanobis) least-squares criterion, with or without Ridge regularization. Two estimation methods are introduced: (i) online stochastic gradient descent algorithms and their averaged variants, and (ii) offline fixed-point algorithms. These methods are applied to both the standard and Mahalanobis least-squares criteria, as well as to their regularized counterparts. Under weak assumptions, the resulting estimators are shown to be asymptotically normal. Since the noise covariance matrix is generally unknown, a robust estimate of this matrix is incorporated into the Mahalanobis-based stochastic gradient descent algorithms. Numerical experiments on synthetic data demonstrate a substantial gain in robustness compared with classical least-squares estimators, while also highlighting the computational efficiency of the online procedures. All proposed algorithms are implemented in the R package RobRegression, available on CRAN.
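To illustrate the general idea behind robustified online estimation (not the paper's exact algorithm, whose reference implementation is the RobRegression package), the sketch below runs an averaged stochastic gradient descent on a bounded-gradient criterion: the squared residual norm is replaced by the (non-squared) Euclidean norm, so each gradient step has bounded influence and outlying responses cannot dominate. The step-size schedule, Ridge term, and Polyak-Ruppert averaging shown here are standard choices assumed for illustration.

```python
import numpy as np


def robust_avg_sgd(X, Y, lr=1.0, ridge=0.0, averaged=True):
    """Illustrative averaged SGD for robust multivariate linear regression.

    Approximately minimizes E[ ||y - B^T x|| ] + (ridge / 2) * ||B||_F^2,
    whose gradient in B is -x (y - B^T x)^T / ||y - B^T x|| + ridge * B.
    The gradient norm is bounded by ||x|| (plus the Ridge term), which is
    what makes the iteration robust to outlying responses y.
    """
    n, p = X.shape
    q = Y.shape[1]
    B = np.zeros((p, q))      # current iterate
    B_bar = np.zeros((p, q))  # Polyak-Ruppert running average
    for t in range(n):
        x, y = X[t], Y[t]
        r = y - x @ B                      # residual vector, length q
        norm = np.linalg.norm(r)
        if norm > 1e-12:                   # bounded-influence gradient
            grad = -np.outer(x, r) / norm + ridge * B
        else:
            grad = ridge * B
        step = lr / (t + 1) ** 0.75        # slowly decaying step size
        B = B - step * grad
        B_bar += (B - B_bar) / (t + 1)     # online average of iterates
    return B_bar if averaged else B
```

On data contaminated with large outliers in the responses, this bounded-gradient recursion stays close to the true coefficient matrix where ordinary least squares degrades; the Mahalanobis variant in the paper additionally reweights residuals by an estimate of the inverse noise covariance.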
Journal overview:
Computational Statistics and Data Analysis (CSDA), an Official Publication of the network Computational and Methodological Statistics (CMStatistics) and of the International Association for Statistical Computing (IASC), is an international journal dedicated to the dissemination of methodological research and applications in the areas of computational statistics and data analysis. The journal consists of four refereed sections which are divided into the following subject areas:
I) Computational Statistics - Manuscripts dealing with: 1) the explicit impact of computers on statistical methodology (e.g., Bayesian computing, bioinformatics, computer graphics, computer intensive inferential methods, data exploration, data mining, expert systems, heuristics, knowledge based systems, machine learning, neural networks, numerical and optimization methods, parallel computing, statistical databases, statistical systems), and 2) the development, evaluation and validation of statistical software and algorithms. Software and algorithms can be submitted with manuscripts and will be stored together with the online article.
II) Statistical Methodology for Data Analysis - Manuscripts dealing with novel and original data analytical strategies and methodologies applied in biostatistics (design and analytic methods for clinical trials, epidemiological studies, statistical genetics, or genetic/environmental interactions), chemometrics, classification, data exploration, density estimation, design of experiments, environmetrics, education, image analysis, marketing, model-free data exploration, pattern recognition, psychometrics, statistical physics, image processing, and robust procedures.
[...]
III) Special Applications - [...]
IV) Annals of Statistical Data Science [...]