Approximation in Hilbert spaces of the Gaussian and related analytic kernels
Toni Karvonen, Yuya Suzuki
IMA Journal of Numerical Analysis, published 2025-07-15. DOI: 10.1093/imanum/draf050 (https://doi.org/10.1093/imanum/draf050)
Journal article. Impact factor 2.4; JCR Q1 (Mathematics, Applied).
Citations: 0
Abstract
We consider linear approximation based on function evaluations in reproducing kernel Hilbert spaces of certain analytic weighted power series kernels and stationary kernels on the interval $[-1,1]$. Both classes contain the popular Gaussian kernel $K(x, y) = \exp (-\tfrac{1}{2}\varepsilon ^{2}(x-y)^{2})$. For weighted power series kernels we derive almost matching upper and lower bounds on the worst-case error. When applied to the Gaussian kernel our results state that, up to a sub-exponential factor, the $n$th minimal error decays as $(\varepsilon /2)^{n} (n!)^{-1/2}$. The proofs are based on weighted polynomial interpolation and classical polynomial coefficient estimates that we use to bound the Hilbert space norm of a weighted polynomial fooling function.
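The stated decay rate for the Gaussian kernel can be made concrete with a short numerical sketch. The helper below (a hypothetical name, not from the paper) evaluates the leading-order bound $(\varepsilon/2)^{n}\,(n!)^{-1/2}$ from the abstract, ignoring the sub-exponential factor, and illustrates its super-exponential decay in $n$:

```python
import math

def gaussian_minimal_error_rate(n: int, eps: float) -> float:
    """Leading-order decay rate (eps/2)^n * (n!)^(-1/2) of the n-th
    minimal worst-case error for the Gaussian kernel, as stated in the
    abstract; the sub-exponential factor is omitted."""
    return (eps / 2.0) ** n / math.sqrt(math.factorial(n))

# For any fixed eps > 0 the rate eventually decays super-exponentially:
# the ratio of consecutive terms is (eps/2) / sqrt(n + 1) -> 0.
rates = [gaussian_minimal_error_rate(n, eps=1.0) for n in range(1, 11)]
print(rates)
```

For $\varepsilon = 1$ each successive term shrinks by the factor $\tfrac{1}{2}\,(n+1)^{-1/2} < 1$, so the sequence is strictly decreasing from $n = 1$ onward; this is only an illustration of the rate, not the paper's proof technique.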
Journal description:
The IMA Journal of Numerical Analysis (IMAJNA) publishes original contributions to all fields of numerical analysis; articles are accepted that treat the theory, development, or use of practical algorithms, as well as interactions between these aspects. Occasional survey articles are also published.