{"title":"与方差分解模型相关的多变量雅可比多项式回归估计器","authors":"","doi":"10.1007/s00184-024-00954-4","DOIUrl":null,"url":null,"abstract":"<h3>Abstract</h3> <p>In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on the use of a novel and special system of multivariate Jacobi polynomials that generate a basis for a reduced size of <span> <span>\\(d-\\)</span> </span>variate finite dimensional polynomials space. An ANOVA decomposition trick has been used for building this space. Also, by using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. <span> <span>\\(d-\\)</span> </span>dimensional random sampling training points follow a <span> <span>\\(d-\\)</span> </span>dimensional Beta distribution. In addition, we provide the reader with an estimate for the <span> <span>\\(L^2-\\)</span> </span>risk error of the estimator. This risk error depends on the <span> <span>\\(L^2-\\)</span> </span>error of the orthogonal projection error of the regression function over the considered polynomials space. An involved study of this orthogonal projection error is done under the condition that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we give the optimal convergence rate of our estimator. Furthermore, we give a regularized extension version of our estimator, that is capable of handling random sampling training vectors drawn according to an unknown multivariate pdf. Moreover, we derive an upper bound for the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on a real data that compares the performance of our estimator with some existing and popular NP regression estimators.</p>","PeriodicalId":49821,"journal":{"name":"Metrika","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2024-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A multivariate Jacobi polynomials regression estimator associated with an ANOVA decomposition model\",\"authors\":\"\",\"doi\":\"10.1007/s00184-024-00954-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3>Abstract</h3> <p>In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on the use of a novel and special system of multivariate Jacobi polynomials that generate a basis for a reduced size of <span> <span>\\\\(d-\\\\)</span> </span>variate finite dimensional polynomials space. An ANOVA decomposition trick has been used for building this space. Also, by using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. <span> <span>\\\\(d-\\\\)</span> </span>dimensional random sampling training points follow a <span> <span>\\\\(d-\\\\)</span> </span>dimensional Beta distribution. In addition, we provide the reader with an estimate for the <span> <span>\\\\(L^2-\\\\)</span> </span>risk error of the estimator. 
This risk error depends on the <span> <span>\\\\(L^2-\\\\)</span> </span>error of the orthogonal projection error of the regression function over the considered polynomials space. An involved study of this orthogonal projection error is done under the condition that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we give the optimal convergence rate of our estimator. Furthermore, we give a regularized extension version of our estimator, that is capable of handling random sampling training vectors drawn according to an unknown multivariate pdf. Moreover, we derive an upper bound for the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on a real data that compares the performance of our estimator with some existing and popular NP regression estimators.</p>\",\"PeriodicalId\":49821,\"journal\":{\"name\":\"Metrika\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2024-02-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Metrika\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s00184-024-00954-4\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Metrika","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s00184-024-00954-4","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
A multivariate Jacobi polynomials regression estimator associated with an ANOVA decomposition model
Abstract
In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on a novel and special system of multivariate Jacobi polynomials that generates a basis for a reduced-size space of \(d\)-variate polynomials of finite dimension. An ANOVA decomposition trick is used to build this space. Also, using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. \(d\)-dimensional random training points follow a \(d\)-dimensional Beta distribution. In addition, we provide an estimate for the \(L^2\)-risk error of the estimator. This risk error depends on the \(L^2\)-error of the orthogonal projection of the regression function onto the considered polynomial space. A detailed study of this orthogonal projection error is carried out under the assumption that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we derive the optimal convergence rate of our estimator. Furthermore, we give a regularized extension of our estimator that can handle training vectors drawn according to an unknown multivariate probability density function. Moreover, we derive an upper bound for the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on real data comparing the performance of our estimator with some existing and popular non-parametric regression estimators.
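To make the ANOVA idea concrete: in the classical ANOVA (Hoeffding) decomposition, a \(d\)-variate function is expanded as \(f(x) = f_0 + \sum_i f_i(x_i) + \sum_{i<j} f_{ij}(x_i, x_j) + \cdots\), and keeping only terms with low interaction order drastically shrinks the polynomial approximation space. The following Python sketch illustrates this kind of construction under stated assumptions; it is not the authors' implementation. The truncation rule, the function names (`anova_multi_indices`, `design_matrix`, `fit_jacobi_regression`) and the parameters `degree`, `max_interaction`, `alpha`, `beta` and `ridge` are illustrative choices, and the sketch uses unnormalized Jacobi polynomials via `scipy.special.eval_jacobi` rather than the special orthonormal system constructed in the paper.

```python
# A minimal sketch (not the paper's estimator) of least-squares regression
# on a tensor-product Jacobi polynomial basis, truncated in the spirit of
# an ANOVA decomposition: only multi-indices with at most `max_interaction`
# active coordinates are kept. All parameter values are illustrative.

import itertools
import numpy as np
from scipy.special import eval_jacobi

def anova_multi_indices(d, degree, max_interaction):
    """Multi-indices k in {0,...,degree}^d with at most
    `max_interaction` nonzero entries (ANOVA-style truncation)."""
    return [k for k in itertools.product(range(degree + 1), repeat=d)
            if sum(1 for ki in k if ki > 0) <= max_interaction]

def design_matrix(X, indices, alpha, beta):
    """Evaluate the tensor-product Jacobi basis at sample points X,
    assumed to lie in [0, 1]^d (mapped to [-1, 1] for eval_jacobi)."""
    T = 2.0 * X - 1.0                       # map [0,1] -> [-1,1]
    n, d = X.shape
    A = np.ones((n, len(indices)))
    for j, k in enumerate(indices):
        for i in range(d):
            if k[i] > 0:
                A[:, j] *= eval_jacobi(k[i], alpha, beta, T[:, i])
    return A

def fit_jacobi_regression(X, y, degree=3, max_interaction=2,
                          alpha=0.0, beta=0.0, ridge=0.0):
    """Least-squares (optionally ridge-regularized) fit; returns a
    callable estimator of the regression function."""
    indices = anova_multi_indices(X.shape[1], degree, max_interaction)
    A = design_matrix(X, indices, alpha, beta)
    G = A.T @ A + ridge * np.eye(A.shape[1])  # regularized Gram matrix
    coef = np.linalg.solve(G, A.T @ y)
    return lambda Xnew: design_matrix(Xnew, indices, alpha, beta) @ coef

# Usage: training points drawn from a Beta distribution, echoing the
# stability condition discussed in the abstract.
rng = np.random.default_rng(0)
X = rng.beta(2.0, 2.0, size=(500, 3))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=500)
f_hat = fit_jacobi_regression(X, y, degree=4, max_interaction=2, ridge=1e-8)
print(f_hat(X[:5]))
```

For \(d = 3\), degree 4 and interaction order 2, the truncated basis has 61 elements instead of the 125 in the full tensor product, which is the kind of size reduction the abstract alludes to.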
Journal introduction:
Metrika is an international journal for theoretical and applied statistics. Metrika publishes original research papers in the field of mathematical statistics and statistical methods. Great importance is attached to new developments in theoretical statistics, statistical modeling and to the actual innovative applicability of the proposed statistical methods and results. Topics of interest include, but are not limited to, multivariate analysis, high dimensional statistics and nonparametric statistics; categorical data analysis and latent variable models; reliability, lifetime data analysis and statistics in engineering sciences.