{"title":"证明基于核的变分最小二乘法的稳定性估计值","authors":"Meng Chen, Leevan Ling, Dongfang Yun","doi":"arxiv-2312.07080","DOIUrl":null,"url":null,"abstract":"Motivated by the need for the rigorous analysis of the numerical stability of\nvariational least-squares kernel-based methods for solving second-order\nelliptic partial differential equations, we provide previously lacking\nstability inequalities. This fills a significant theoretical gap in the\nprevious work [Comput. Math. Appl. 103 (2021) 1-11], which provided error\nestimates based on a conjecture on the stability. With the stability estimate\nnow rigorously proven, we complete the theoretical foundations and compare the\nconvergence behavior to the proven rates. Furthermore, we establish another\nstability inequality involving weighted-discrete norms, and provide a\ntheoretical proof demonstrating that the exact quadrature weights are not\nnecessary for the weighted least-squares kernel-based collocation method to\nconverge. Our novel theoretical insights are validated by numerical examples,\nwhich showcase the relative efficiency and accuracy of these methods on data\nsets with large mesh ratios. The results confirm our theoretical predictions\nregarding the performance of variational least-squares kernel-based method,\nleast-squares kernel-based collocation method, and our new weighted\nleast-squares kernel-based collocation method. 
Most importantly, our results\ndemonstrate that all methods converge at the same rate, validating the\nconvergence theory of weighted least-squares in our proven theories.","PeriodicalId":501061,"journal":{"name":"arXiv - CS - Numerical Analysis","volume":"169 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Proving the stability estimates of variational least-squares Kernel-Based methods\",\"authors\":\"Meng Chen, Leevan Ling, Dongfang Yun\",\"doi\":\"arxiv-2312.07080\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Motivated by the need for the rigorous analysis of the numerical stability of\\nvariational least-squares kernel-based methods for solving second-order\\nelliptic partial differential equations, we provide previously lacking\\nstability inequalities. This fills a significant theoretical gap in the\\nprevious work [Comput. Math. Appl. 103 (2021) 1-11], which provided error\\nestimates based on a conjecture on the stability. With the stability estimate\\nnow rigorously proven, we complete the theoretical foundations and compare the\\nconvergence behavior to the proven rates. Furthermore, we establish another\\nstability inequality involving weighted-discrete norms, and provide a\\ntheoretical proof demonstrating that the exact quadrature weights are not\\nnecessary for the weighted least-squares kernel-based collocation method to\\nconverge. Our novel theoretical insights are validated by numerical examples,\\nwhich showcase the relative efficiency and accuracy of these methods on data\\nsets with large mesh ratios. The results confirm our theoretical predictions\\nregarding the performance of variational least-squares kernel-based method,\\nleast-squares kernel-based collocation method, and our new weighted\\nleast-squares kernel-based collocation method. 
Most importantly, our results\\ndemonstrate that all methods converge at the same rate, validating the\\nconvergence theory of weighted least-squares in our proven theories.\",\"PeriodicalId\":501061,\"journal\":{\"name\":\"arXiv - CS - Numerical Analysis\",\"volume\":\"169 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-12-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Numerical Analysis\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2312.07080\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2312.07080","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Proving the stability estimates of variational least-squares Kernel-Based methods
Motivated by the need for a rigorous analysis of the numerical stability of
variational least-squares kernel-based methods for solving second-order
elliptic partial differential equations, we provide the previously missing
stability inequalities. This fills a significant theoretical gap in the
previous work [Comput. Math. Appl. 103 (2021) 1-11], which provided error
estimates based on a conjectured stability result. With the stability estimate
now rigorously proven, we complete the theoretical foundations and compare the
convergence behavior to the proven rates. Furthermore, we establish another
stability inequality involving weighted-discrete norms, and provide a
theoretical proof demonstrating that the exact quadrature weights are not
necessary for the weighted least-squares kernel-based collocation method to
converge. Our novel theoretical insights are validated by numerical examples,
which showcase the relative efficiency and accuracy of these methods on data
sets with large mesh ratios. The results confirm our theoretical predictions
regarding the performance of the variational least-squares kernel-based method,
the least-squares kernel-based collocation method, and our new weighted
least-squares kernel-based collocation method. Most importantly, all three
methods converge at the same rate, consistent with our proven convergence
theory for weighted least-squares.