Functional Adaptive Huber Linear Regression

Ling Peng, Xiaohui Liu, Heng Lian
arXiv - STAT - Statistics Theory · Journal Article · Published 2024-09-17 · DOI: arxiv-2409.11053
Citations: 0

Abstract

Robust estimation has played an important role in statistical and machine learning. However, its applications to functional linear regression are still under-developed. In this paper, we focus on Huber's loss with a diverging robustness parameter which was previously used in parametric models. Compared to other robust methods such as median regression, the distinction is that the proposed method aims to estimate the conditional mean robustly, instead of estimating the conditional median. We only require $(1+\kappa)$-th moment assumption ($\kappa>0$) on the noise distribution, and the established error bounds match the optimal rate in the least-squares case as soon as $\kappa\ge 1$. We establish convergence rate in probability when the functional predictor has a finite 4-th moment, and finite-sample bound with exponential tail when the functional predictor is Gaussian, in terms of both prediction error and $L^2$ error. The results also extend to the case of functional estimation in a reproducing kernel Hilbert space (RKHS).
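To make the idea concrete, below is a minimal sketch (not the authors' algorithm) of Huber-loss functional linear regression in the spirit of the abstract: the functional predictor is reduced to its leading principal-component scores (an assumed, standard dimension reduction), and the robustness parameter `tau` is set to diverge with the sample size at the `n^{1/(1+kappa)}` rate suggested by the `(1+kappa)`-th moment condition. The function names, the learning-rate choice, and the scale factor `np.std(y)` in `tau` are illustrative assumptions, not details from the paper.

```python
import numpy as np

def huber_grad(r, tau):
    """Gradient of the Huber loss, applied elementwise to residuals r."""
    return np.where(np.abs(r) <= tau, r, tau * np.sign(r))

def adaptive_huber_flr(X, y, n_components=4, kappa=1.0, lr=0.5, n_iter=300):
    """Sketch: functional linear regression with an adaptive Huber loss.

    X : (n, p) functional predictors observed on a common grid.
    y : (n,)  scalar responses.
    """
    n = len(y)
    Xc = X - X.mean(axis=0)
    # Empirical FPCA via SVD; keep the leading principal-component scores.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = U[:, :n_components] * np.sqrt(n)   # whitened scores: Z.T @ Z / n = I
    # Diverging robustness parameter, rate n^{1/(1+kappa)} (scale is an assumption).
    tau = np.std(y) * n ** (1.0 / (1.0 + kappa))
    beta = np.zeros(n_components)
    intercept = np.median(y)
    for _ in range(n_iter):                # gradient descent on the Huber risk
        r = y - intercept - Z @ beta
        g = huber_grad(r, tau)
        beta += lr * Z.T @ g / n
        intercept += lr * g.mean()
    # Map score-space coefficients back to a slope vector on the grid
    # (the quadrature weight of the grid is absorbed into the coefficients).
    slope = Vt[:n_components].T @ (np.sqrt(n) / s[:n_components] * beta)

    def predict(X_new):
        return intercept + (X_new - X.mean(axis=0)) @ slope

    return slope, intercept, predict
```

Because the scores are whitened, plain gradient descent with a step size below 1 converges quickly; for heavy-tailed noise, a smaller `kappa` yields a smaller `tau` and hence more aggressive truncation of large residuals, which is exactly the bias–robustness trade-off the diverging-parameter analysis quantifies.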