{"title":"维度对核脊回归估计器收敛率的影响","authors":"Kwan-Young Bak, Woojoo Lee","doi":"10.1016/j.jspi.2024.106228","DOIUrl":null,"url":null,"abstract":"Despite the curse of dimensionality, kernel ridge regression often exhibits good performance in practical applications, even when the dimension is moderately large. However, it has been shown that kernel ridge regression cannot be free from the curse of dimensionality. Until now, the literature on kernel ridge regression has suggested that the gap between theory and practice in relation to dimensionality has not narrowed. In this study, we first investigate when the influence of dimensionality does not significantly affect the convergence rate of the kernel ridge regression. Specifically, we study the convergence rate of and risks for the kernel ridge estimator, with a focus on reproducing kernel Hilbert space (RKHS) generated by a product kernel. We show that the univariate optimal convergence rate up to a logarithmic factor in and risks can be achieved by controlling the size of the RKHS. The result of a numerical study confirms our theoretical findings.","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"42 1","pages":""},"PeriodicalIF":0.8000,"publicationDate":"2024-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Effect of dimensionality on convergence rates of kernel ridge regression estimator\",\"authors\":\"Kwan-Young Bak, Woojoo Lee\",\"doi\":\"10.1016/j.jspi.2024.106228\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Despite the curse of dimensionality, kernel ridge regression often exhibits good performance in practical applications, even when the dimension is moderately large. However, it has been shown that kernel ridge regression cannot be free from the curse of dimensionality. 
Until now, the literature on kernel ridge regression has suggested that the gap between theory and practice in relation to dimensionality has not narrowed. In this study, we first investigate when the influence of dimensionality does not significantly affect the convergence rate of the kernel ridge regression. Specifically, we study the convergence rate of and risks for the kernel ridge estimator, with a focus on reproducing kernel Hilbert space (RKHS) generated by a product kernel. We show that the univariate optimal convergence rate up to a logarithmic factor in and risks can be achieved by controlling the size of the RKHS. The result of a numerical study confirms our theoretical findings.\",\"PeriodicalId\":50039,\"journal\":{\"name\":\"Journal of Statistical Planning and Inference\",\"volume\":\"42 1\",\"pages\":\"\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2024-08-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Statistical Planning and Inference\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1016/j.jspi.2024.106228\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1016/j.jspi.2024.106228","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Effect of dimensionality on convergence rates of kernel ridge regression estimator
Despite the curse of dimensionality, kernel ridge regression often performs well in practical applications, even when the dimension is moderately large. However, it has been shown that kernel ridge regression cannot escape the curse of dimensionality, and until now the literature has suggested that the gap between theory and practice regarding dimensionality has not narrowed. In this study, we first investigate when dimensionality does not significantly affect the convergence rate of kernel ridge regression. Specifically, we study the convergence rates of the risks of the kernel ridge estimator, with a focus on the reproducing kernel Hilbert space (RKHS) generated by a product kernel. We show that the univariate optimal convergence rate, up to a logarithmic factor, can be achieved in these risks by controlling the size of the RKHS. The results of a numerical study confirm our theoretical findings.
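To make the setting concrete, the following is a minimal sketch of kernel ridge regression with a product (tensor) kernel, the structure the abstract refers to. This is an illustrative implementation in NumPy, not the estimator or tuning scheme analyzed in the paper; the Gaussian base kernel, the bandwidth `gamma`, and the regularization parameter `lam` are all assumptions chosen for the example.

```python
import numpy as np

def product_gaussian_kernel(X, Z, gamma=0.5):
    # Product kernel: a product over coordinates of univariate Gaussian
    # kernels, K(x, z) = prod_j exp(-gamma * (x_j - z_j)^2).
    # For the Gaussian base kernel this coincides with the multivariate RBF.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=0.5):
    # Dual ridge solution: solve (K + n*lam*I) alpha = y.
    n = X.shape[0]
    K = product_gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=0.5):
    # Prediction at new points: f(x) = sum_i alpha_i K(x, x_i).
    return product_gaussian_kernel(X_new, X_train, gamma) @ alpha

# Toy check in moderate dimension (d = 5), with an additive target.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(X).sum(axis=1) + 0.1 * rng.standard_normal(200)
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, alpha, X)
```

The product structure of the kernel is what constrains the size of the induced RKHS; in the paper this control is the mechanism behind the near-univariate convergence rate.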
Journal introduction:
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary aspects that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large-sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.