{"title":"论索博廖夫重现核希尔伯特空间中目标数据依赖核贪婪插值的最优性","authors":"Gabriele Santin, Tizian Wenzel, Bernard Haasdonk","doi":"10.1137/23m1587956","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Numerical Analysis, Volume 62, Issue 5, Page 2249-2275, October 2024. <br/> Abstract. Kernel interpolation is a versatile tool for the approximation of functions from data, and it can be proven to have some optimality properties when used with kernels related to certain Sobolev spaces. In the context of interpolation, the selection of optimal function sampling locations is a central problem, both from a practical perspective and as an interesting theoretical question. Greedy interpolation algorithms provide a viable solution for this task, being efficient to run and provably accurate in their approximation. In this paper we close a gap that is present in the convergence theory for these algorithms by employing a recent result on general greedy algorithms. This modification leads to new convergence rates which match the optimal ones when restricted to the [math]-greedy target-data-independent selection rule and can additionally be proven to be optimal when they fully exploit adaptivity ([math]-greedy). Other than closing this gap, the new results have some significance in the broader setting of the optimality of general approximation algorithms in reproducing kernel Hilbert spaces, as they allow us to compare adaptive interpolation with nonadaptive best nonlinear approximation.","PeriodicalId":49527,"journal":{"name":"SIAM Journal on Numerical Analysis","volume":"31 1","pages":""},"PeriodicalIF":2.8000,"publicationDate":"2024-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On the Optimality of Target-Data-Dependent Kernel Greedy Interpolation in Sobolev Reproducing Kernel Hilbert Spaces\",\"authors\":\"Gabriele Santin, Tizian Wenzel, Bernard Haasdonk\",\"doi\":\"10.1137/23m1587956\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SIAM Journal on Numerical Analysis, Volume 62, Issue 5, Page 2249-2275, October 2024. <br/> Abstract. Kernel interpolation is a versatile tool for the approximation of functions from data, and it can be proven to have some optimality properties when used with kernels related to certain Sobolev spaces. In the context of interpolation, the selection of optimal function sampling locations is a central problem, both from a practical perspective and as an interesting theoretical question. Greedy interpolation algorithms provide a viable solution for this task, being efficient to run and provably accurate in their approximation. In this paper we close a gap that is present in the convergence theory for these algorithms by employing a recent result on general greedy algorithms. This modification leads to new convergence rates which match the optimal ones when restricted to the [math]-greedy target-data-independent selection rule and can additionally be proven to be optimal when they fully exploit adaptivity ([math]-greedy). 
Other than closing this gap, the new results have some significance in the broader setting of the optimality of general approximation algorithms in reproducing kernel Hilbert spaces, as they allow us to compare adaptive interpolation with nonadaptive best nonlinear approximation.\",\"PeriodicalId\":49527,\"journal\":{\"name\":\"SIAM Journal on Numerical Analysis\",\"volume\":\"31 1\",\"pages\":\"\"},\"PeriodicalIF\":2.8000,\"publicationDate\":\"2024-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIAM Journal on Numerical Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1137/23m1587956\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Numerical Analysis","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/23m1587956","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
On the Optimality of Target-Data-Dependent Kernel Greedy Interpolation in Sobolev Reproducing Kernel Hilbert Spaces
SIAM Journal on Numerical Analysis, Volume 62, Issue 5, Pages 2249-2275, October 2024.

Abstract. Kernel interpolation is a versatile tool for the approximation of functions from data, and it can be proven to have some optimality properties when used with kernels related to certain Sobolev spaces. In the context of interpolation, the selection of optimal function sampling locations is a central problem, both from a practical perspective and as an interesting theoretical question. Greedy interpolation algorithms provide a viable solution for this task, being efficient to run and provably accurate in their approximation. In this paper we close a gap that is present in the convergence theory for these algorithms by employing a recent result on general greedy algorithms. This modification leads to new convergence rates which match the optimal ones when restricted to the P-greedy, target-data-independent selection rule, and which can additionally be proven to be optimal when the algorithm fully exploits adaptivity (f-greedy). Beyond closing this gap, the new results have some significance in the broader setting of the optimality of general approximation algorithms in reproducing kernel Hilbert spaces, as they allow us to compare adaptive interpolation with nonadaptive best nonlinear approximation.
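To make the two selection rules mentioned in the abstract concrete, the following is a minimal, self-contained sketch of greedy kernel interpolation (not code from the paper): P-greedy picks the point where the power function is largest, ignoring the target data, while f-greedy picks the point with the largest current residual. The basic Matern kernel, the synthetic test function, and all parameter values here are illustrative assumptions.

```python
# Illustrative sketch of greedy kernel interpolation with P-greedy and
# f-greedy point selection; not an implementation from the paper.
import numpy as np

def matern_kernel(X, Y, eps=3.0):
    # Basic Matern kernel exp(-eps * ||x - y||); its native space is
    # norm-equivalent to a Sobolev space, matching the paper's setting.
    dist = np.sqrt(np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1))
    return np.exp(-eps * dist)

def greedy_interpolation(X, f, n_points=10, rule="f"):
    """Greedily select interpolation points from the candidate set X.

    rule="P": P-greedy, maximize the power function (target-data-independent).
    rule="f": f-greedy, maximize the absolute residual (adaptive).
    Returns the selected indices and the interpolant coefficients.
    """
    selected = []
    residual = f.copy()
    diag = np.diag(matern_kernel(X, X))   # K(x, x) for all candidates
    power2 = diag.copy()                  # squared power function
    coeffs = np.zeros(0)
    for _ in range(n_points):
        if rule == "P":
            idx = int(np.argmax(power2))
        else:
            idx = int(np.argmax(np.abs(residual)))
        selected.append(idx)
        # Recompute the interpolant on the selected points (a direct solve;
        # an incremental Newton-basis update would be cheaper).
        Kss = matern_kernel(X[selected], X[selected])
        coeffs = np.linalg.solve(Kss, f[selected])
        residual = f - matern_kernel(X, X[selected]) @ coeffs
        # Squared power function: K(x,x) - k_s(x)^T Kss^{-1} k_s(x).
        Kxs = matern_kernel(X, X[selected])
        V = np.linalg.solve(Kss, Kxs.T)
        power2 = diag - np.einsum("ij,ji->i", Kxs, V)
    return selected, coeffs

# Illustrative usage on synthetic 1D data.
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
f = np.sin(6 * np.pi * X[:, 0])
idx_f, _ = greedy_interpolation(X, f, n_points=15, rule="f")  # adaptive
idx_P, _ = greedy_interpolation(X, f, n_points=15, rule="P")  # data-independent
```

On a target like this, f-greedy tends to concentrate points where the residual is still large, whereas P-greedy fills the domain quasi-uniformly regardless of f; the paper's results concern the convergence rates attainable by such rules.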
About the journal:
SIAM Journal on Numerical Analysis (SINUM) contains research articles on the development and analysis of numerical methods. Topics include the rigorous study of convergence of algorithms, their accuracy, their stability, and their computational complexity. Also included are results in mathematical analysis that contribute to algorithm analysis, and computational results that demonstrate algorithm behavior and applicability.