{"title":"稳健在线学习的最优性","authors":"Zheng-Chu Guo, Andreas Christmann, Lei Shi","doi":"10.1007/s10208-023-09616-9","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we study an online learning algorithm with a robust loss function <span>\\(\\mathcal {L}_{\\sigma }\\)</span> for regression over a reproducing kernel Hilbert space (RKHS). The loss function <span>\\(\\mathcal {L}_{\\sigma }\\)</span> involving a scaling parameter <span>\\(\\sigma >0\\)</span> can cover a wide range of commonly used robust losses. The proposed algorithm is then a robust alternative for online least squares regression aiming to estimate the conditional mean function. For properly chosen <span>\\(\\sigma \\)</span> and step size, we show that the last iterate of this online algorithm can achieve optimal capacity independent convergence in the mean square distance. Moreover, if additional information on the underlying function space is known, we also establish optimal capacity-dependent rates for strong convergence in RKHS. To the best of our knowledge, both of the two results are new to the existing literature of online learning.</p>","PeriodicalId":55151,"journal":{"name":"Foundations of Computational Mathematics","volume":"31 1","pages":""},"PeriodicalIF":2.5000,"publicationDate":"2023-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Optimality of Robust Online Learning\",\"authors\":\"Zheng-Chu Guo, Andreas Christmann, Lei Shi\",\"doi\":\"10.1007/s10208-023-09616-9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In this paper, we study an online learning algorithm with a robust loss function <span>\\\\(\\\\mathcal {L}_{\\\\sigma }\\\\)</span> for regression over a reproducing kernel Hilbert space (RKHS). The loss function <span>\\\\(\\\\mathcal {L}_{\\\\sigma }\\\\)</span> involving a scaling parameter <span>\\\\(\\\\sigma >0\\\\)</span> can cover a wide range of commonly used robust losses. The proposed algorithm is then a robust alternative for online least squares regression aiming to estimate the conditional mean function. For properly chosen <span>\\\\(\\\\sigma \\\\)</span> and step size, we show that the last iterate of this online algorithm can achieve optimal capacity independent convergence in the mean square distance. Moreover, if additional information on the underlying function space is known, we also establish optimal capacity-dependent rates for strong convergence in RKHS. 
To the best of our knowledge, both of the two results are new to the existing literature of online learning.</p>\",\"PeriodicalId\":55151,\"journal\":{\"name\":\"Foundations of Computational Mathematics\",\"volume\":\"31 1\",\"pages\":\"\"},\"PeriodicalIF\":2.5000,\"publicationDate\":\"2023-07-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Foundations of Computational Mathematics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10208-023-09616-9\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations of Computational Mathematics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10208-023-09616-9","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
In this paper, we study an online learning algorithm with a robust loss function \(\mathcal {L}_{\sigma }\) for regression over a reproducing kernel Hilbert space (RKHS). The loss function \(\mathcal {L}_{\sigma }\), which involves a scaling parameter \(\sigma >0\), covers a wide range of commonly used robust losses. The proposed algorithm is thus a robust alternative to online least squares regression for estimating the conditional mean function. For properly chosen \(\sigma \) and step size, we show that the last iterate of this online algorithm achieves optimal capacity-independent convergence in the mean square distance. Moreover, if additional information on the underlying function space is available, we also establish optimal capacity-dependent rates for strong convergence in the RKHS. To the best of our knowledge, both results are new to the existing literature on online learning.
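The abstract does not spell out the update rule, but online learning over an RKHS with a differentiable loss is typically stochastic gradient descent on the kernel expansion of the iterate. The sketch below is a minimal illustration under that assumption: the Gaussian kernel, the Welsch loss as the instance of \(\mathcal {L}_{\sigma }\), and the step-size schedule \(\eta_t = t^{-\theta}\) are all placeholder choices for illustration, not the paper's prescriptions.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel; any Mercer kernel on the input space would do.
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def welsch_derivative(r, sigma):
    # Derivative of the Welsch loss L_sigma(r) = (sigma^2/2)(1 - exp(-r^2/sigma^2)),
    # taken here as one member of the scaled robust family the abstract refers to.
    return r * np.exp(-(r ** 2) / sigma ** 2)

def online_robust_regression(stream, sigma=1.0, theta=0.6, gamma=1.0):
    """One pass of online gradient descent in the RKHS:

        f_{t+1} = f_t - eta_t * L_sigma'(f_t(x_t) - y_t) * K(x_t, .)

    The iterate f_t is stored through its kernel expansion coefficients.
    eta_t = t^(-theta) is a placeholder schedule; the paper's analysis
    concerns how sigma and eta_t must be chosen to obtain optimal rates.
    """
    centers, coefs = [], []
    for t, (x, y) in enumerate(stream, start=1):
        # Evaluate the current iterate at the new point via its expansion.
        f_x = sum(c * gaussian_kernel(xc, x, gamma) for xc, c in zip(centers, coefs))
        eta = t ** (-theta)
        # Gradient step adds a new kernel section K(x_t, .) to the expansion.
        centers.append(x)
        coefs.append(-eta * welsch_derivative(f_x - y, sigma))
    return centers, coefs

# Tiny usage example on a noisy 1-D regression stream.
rng = np.random.default_rng(0)
stream = [(x, np.sin(3 * x) + 0.1 * rng.standard_normal())
          for x in rng.uniform(-1, 1, 200)]
centers, coefs = online_robust_regression(stream, sigma=1.0, theta=0.6, gamma=5.0)
```

With the Welsch instance, the gradient weight \(r\,e^{-r^2/\sigma^2}\) decays for large residuals \(r\), which is what makes the update robust to outliers compared with the unbounded least squares weight \(r\).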
About the journal:
Foundations of Computational Mathematics (FoCM) publishes research and survey papers of the highest quality that further the understanding of the connections between mathematics and computation. The journal aims to promote the exploration of all fundamental issues underlying the creative tension among mathematics, computer science, and application areas, unencumbered by external criteria such as the pressure for applications. The journal thus serves an increasingly important and applicable area of mathematics, and hopes to further the understanding of the deep relationships between mathematical theory (analysis, topology, geometry, and algebra) and computational processes as they evolve in tandem with the modern computer.
With its distinguished editorial board selecting papers of the highest quality and interest from the international community, FoCM hopes to influence both mathematics and computation. Relevance to applications will not constitute a requirement for the publication of articles.
The journal does not accept code for review; however, authors who have code or data related to a submission should include a web link to the repository where the code/data is stored.