Functional Adaptive Huber Linear Regression
Ling Peng, Xiaohui Liu, Heng Lian
arXiv - STAT - Statistics Theory, 2024-09-17 (arxiv-2409.11053)
https://doi.org/arxiv-2409.11053
Citations: 0
Abstract
Robust estimation has played an important role in statistics and machine learning. However, its applications to functional linear regression remain under-developed. In this paper, we focus on Huber's loss with a diverging robustness parameter, which has previously been used in parametric models. Compared to other robust methods such as median regression, the distinction is that the proposed method aims to estimate the conditional mean robustly, instead of the conditional median. We only require a $(1+\kappa)$-th moment assumption ($\kappa>0$) on the noise distribution, and the established error bounds match the optimal rate of the least-squares case as soon as $\kappa\ge 1$. We establish a convergence rate in probability when the functional predictor has a finite fourth moment, and a finite-sample bound with an exponential tail when the functional predictor is Gaussian, in terms of both prediction error and $L^2$ error. The results also extend to functional estimation in a reproducing kernel Hilbert space (RKHS).
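To make the idea of a diverging robustness parameter concrete, here is a minimal sketch of adaptive Huber regression for a finite-dimensional (non-functional) linear model. It is an illustration only, not the authors' functional estimator: the function names and the particular growth rate for the robustness parameter `tau` (one common choice in the adaptive Huber literature, scaling like $(n/\log n)^{1/(1+\min(\kappa,1))}$) are assumptions; the paper's exact tuning may differ.

```python
import numpy as np

def huber_score(r, tau):
    # Derivative of the Huber loss: identity on |r| <= tau,
    # clipped to +/- tau outside (this is what robustifies the mean).
    return np.where(np.abs(r) <= tau, r, tau * np.sign(r))

def adaptive_huber_fit(X, y, kappa=1.0, c=1.0, lr=0.1, n_iter=500):
    """Gradient descent on the Huber loss with a sample-size-dependent tau.

    tau diverges with n, so the estimator targets the conditional mean
    (unlike median regression, whose tuning stays fixed).
    """
    n, p = X.shape
    # Assumed rate: tau ~ c * (n / log n)^{1 / (1 + min(kappa, 1))}.
    tau = c * (n / np.log(n)) ** (1.0 / (1.0 + min(kappa, 1.0)))
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        grad = -X.T @ huber_score(r, tau) / n  # average clipped score
        beta -= lr * grad
    return beta, tau
```

With heavy-tailed noise that has only a few finite moments (e.g. Student-t), the clipped score keeps outliers from dominating the fit, while the diverging `tau` lets the bias relative to the conditional mean vanish as $n$ grows.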