{"title":"再生核Hilbert空间中函数线性回归的梯度迭代方法","authors":"Hongzhi Tong, Michael Ng","doi":"10.4208/aam.oa-2021-0016","DOIUrl":null,"url":null,"abstract":". We consider a gradient iteration algorithm for prediction of functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iteration from an overfitting function. Under mild conditions, we obtain upper bounds, essentially matching the known minimax lower bounds, for excess prediction risk. An almost sure convergence is also established for the proposed algorithm.","PeriodicalId":58853,"journal":{"name":"应用数学年刊:英文版","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Gradient Iteration Method for Functional Linear Regression in Reproducing Kernel Hilbert Spaces\",\"authors\":\"Hongzhi Tong, Michael Ng\",\"doi\":\"10.4208/aam.oa-2021-0016\",\"DOIUrl\":null,\"url\":null,\"abstract\":\". We consider a gradient iteration algorithm for prediction of functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iteration from an overfitting function. Under mild conditions, we obtain upper bounds, essentially matching the known minimax lower bounds, for excess prediction risk. An almost sure convergence is also established for the proposed algorithm.\",\"PeriodicalId\":58853,\"journal\":{\"name\":\"应用数学年刊:英文版\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"应用数学年刊:英文版\",\"FirstCategoryId\":\"1089\",\"ListUrlMain\":\"https://doi.org/10.4208/aam.oa-2021-0016\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"应用数学年刊:英文版","FirstCategoryId":"1089","ListUrlMain":"https://doi.org/10.4208/aam.oa-2021-0016","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Gradient Iteration Method for Functional Linear Regression in Reproducing Kernel Hilbert Spaces
Abstract. We consider a gradient iteration algorithm for prediction in functional linear regression under the framework of reproducing kernel Hilbert spaces. The algorithm uses an early stopping technique, instead of classical Tikhonov regularization, to prevent the iteration from producing an overfitting function. Under mild conditions, we obtain upper bounds on the excess prediction risk that essentially match the known minimax lower bounds. Almost sure convergence of the proposed algorithm is also established.
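
To make the setup concrete, the following is a minimal sketch of gradient iteration with validation-based early stopping for a functional linear model Y = ⟨X, β⟩_{L2} + noise, with the slope function β estimated in an RKHS. This is an illustrative toy version under stated assumptions, not the authors' exact algorithm: the Gaussian kernel, Riemann quadrature, step size, and patience-based stopping rule are all assumptions made for the sketch.

```python
# Illustrative sketch (assumptions: Gaussian kernel, Riemann quadrature,
# synthetic data, patience-based stopping). Not the paper's exact method.
import numpy as np

rng = np.random.default_rng(0)

# Discretization grid on [0, 1] and quadrature weight.
m = 100                       # grid points per curve
t = np.linspace(0.0, 1.0, m)
w = 1.0 / m                   # simple Riemann quadrature weight

# Synthetic functional data: n random curves X_i, scalar responses Y_i.
n = 200
X = np.array([np.sin(2 * np.pi * rng.uniform(1, 3) * t)
              + 0.3 * rng.standard_normal(m) for _ in range(n)])
beta_true = np.exp(-((t - 0.5) ** 2) / 0.02)      # true slope function
y = w * X @ beta_true + 0.1 * rng.standard_normal(n)

# Gaussian reproducing kernel K(s, t) evaluated on the grid.
sigma = 0.1
G = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * sigma ** 2))

# Hold out a validation set, used only by the stopping rule.
n_tr = 150
X_tr, y_tr = X[:n_tr], y[:n_tr]
X_va, y_va = X[n_tr:], y[n_tr:]

def predict(Xc, b):
    """Approximate <X_i, beta>_{L2} for each curve by quadrature."""
    return w * Xc @ b

# Gradient iteration in the RKHS: the functional gradient of the empirical
# squared-error risk pushes residuals back through the kernel integral
# operator L_K, discretized here as b -> w * G @ b.
eta = 0.5                     # step size (illustrative choice)
max_iter = 2000
patience = 20                 # non-improving steps tolerated before stopping
b = np.zeros(m)               # beta_0 = 0
best_b, best_err, stall = b.copy(), np.inf, 0

for k in range(max_iter):
    r = predict(X_tr, b) - y_tr                  # training residuals
    grad = (2.0 / n_tr) * w * G @ (X_tr.T @ r)   # RKHS gradient via L_K
    b = b - eta * grad
    val_err = np.mean((predict(X_va, b) - y_va) ** 2)
    if val_err < best_err:
        best_b, best_err, stall = b.copy(), val_err, 0
    else:
        stall += 1
        if stall >= patience:  # early stopping in place of Tikhonov
            break

print(f"stopped at iteration {k}, best validation MSE {best_err:.4f}")
```

The design choice the abstract highlights is visible in the loop: no penalty term is ever added to the risk, and the number of gradient steps itself plays the role of the regularization parameter, with the stopping time chosen from held-out data rather than by tuning a Tikhonov penalty.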