Error analysis for l^q-coefficient regularized moving least-square regression
Qin Guo, Peixin Ye
Journal of Inequalities and Applications, vol. 2018, no. 1, p. 262 (published 2018; Epub 2018/9/25)
DOI: 10.1186/s13660-018-1856-y
Impact Factor 1.6, JCR Q1 (Mathematics)
Citations: 0
Abstract
We consider the moving least-square (MLS) method within the coefficient-based regression framework with an l^q-regularizer (1 ≤ q ≤ 2) and sample-dependent hypothesis spaces. The data-dependent character of the new algorithm provides flexibility and adaptivity for MLS. We carry out a rigorous error analysis using the stepping-stone technique in the error decomposition. A concentration technique based on the l^2-empirical covering number is also employed in our study to improve the sample error. We derive a satisfactory learning rate that can be arbitrarily close to the best rate O(m^{-1}) under more natural and much simpler conditions.
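The estimator described above can be illustrated with a minimal numerical sketch for the q = 2 case, where the penalized weighted least-squares problem has a closed form. The Gaussian kernel, the bandwidth h, the moving-weight function, and the regularization parameter lam are illustrative assumptions, not the paper's exact setup; for q < 2 the coefficient problem is non-smooth and would need an iterative (e.g. proximal) solver instead.

```python
import numpy as np

def mls_lq_predict(X, y, x_query, lam=1e-3, h=0.5, q=2):
    """Sketch of a coefficient-regularized MLS prediction at one query point.

    Hypothesis (sample-dependent): f(u) = sum_i alpha_i * K(u, X_i) with a
    Gaussian kernel K; alpha minimizes the moving-weighted squared loss plus
    lam * ||alpha||_q^q. Only the q = 2 (ridge-type) closed form is sketched.
    """
    if q != 2:
        # q in [1, 2) gives a non-smooth penalty; would need e.g. a proximal solver.
        raise NotImplementedError("only the q = 2 closed form is sketched here")
    m = len(X)
    # Kernel Gram matrix K[i, j] = K(X_i, X_j) for the coefficient-based hypothesis
    K = np.exp(-np.square(X[:, None] - X[None, :]).sum(-1) / (2 * h ** 2))
    # Moving weights centered at the query point (the "moving" part of MLS)
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * h ** 2))
    W = np.diag(w)
    # q = 2 normal equations: (K' W K + lam I) alpha = K' W y
    alpha = np.linalg.solve(K.T @ W @ K + lam * np.eye(m), K.T @ W @ y)
    # Evaluate f(x_query) = sum_i alpha_i * K(x_query, X_i)
    k_query = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * h ** 2))
    return k_query @ alpha
```

Because both the weights w and the solved coefficients alpha depend on the query point, the fit is recomputed locally for each prediction, which is the source of the flexibility the abstract attributes to the data-dependent hypothesis space.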
About the journal
The aim of this journal is to provide a multidisciplinary forum for discussion in mathematics and its applications in which the essentiality of inequalities is highlighted. The journal accepts high-quality articles containing original research results and survey articles of exceptional merit. Subject matter should be strongly related to inequalities, such as, but not restricted to, the following: inequalities in analysis, inequalities in approximation theory, inequalities in combinatorics, inequalities in economics, inequalities in geometry, inequalities in mechanics, inequalities in optimization, inequalities in stochastic analysis and applications.