The inner partial least square: An exploration of the “necessary” dimension reduction
Yunjian Yin, Lan Liu
Journal of Multivariate Analysis, published 2024-08-14. DOI: 10.1016/j.jmva.2024.105356
The partial least squares (PLS) algorithm retains the combinations of predictors that maximize the covariance with the outcome. Cook et al. (2013) showed that PLS estimates a predictor envelope: the smallest reducing subspace of the predictors’ covariance that contains the regression coefficients. However, because PLS and the predictor envelope both target a space that contains the regression coefficients, they may sometimes be too conservative and fail to reduce the dimension of the predictors. In this paper, we propose a new method that may improve the estimation efficiency of the regression coefficients when both PLS and the predictor envelope fail to do so. Specifically, our method targets the largest reducing subspace of the predictors’ covariance that is contained in the span of the coefficient matrix. Interestingly, the moment-based algorithm for our proposed method can be obtained by changing the max in PLS to a min. We call the modified PLS the inner PLS and the resulting space the inner predictor envelope space. We establish the theoretical properties of the proposed methods and demonstrate their use on the China Health and Nutrition Survey.
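The moment-based construction the abstract refers to can be sketched for a univariate outcome: standard PLS extracts each weight vector as the maximizer of the sample covariance between a linear combination of the predictors and the outcome, deflating the predictors between components. The sketch below illustrates only this ordinary max step; the paper's inner PLS replaces the max with a min, and its exact algorithm is not reproduced here.

```python
import numpy as np

def pls_weights(X, y, d):
    """Moment-based (NIPALS-style) PLS for a univariate outcome.

    Each weight vector w is the unit vector maximizing |Cov(Xw, y)|,
    which is proportional to the sample covariance vector X'y; X is
    deflated between components so successive weights capture new
    directions. (Illustrative sketch of ordinary PLS only.)
    """
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W = []
    for _ in range(d):
        s = Xk.T @ yk                    # proportional to sample Cov(X, y)
        w = s / np.linalg.norm(s)        # maximizer of |Cov(Xw, y)| on the unit sphere
        t = Xk @ w                       # component score
        p_load = Xk.T @ t / (t @ t)      # predictor loading
        Xk = Xk - np.outer(t, p_load)    # deflate X before the next component
        W.append(w)
    return np.column_stack(W)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(200)
W = pls_weights(X, y, 2)
print(W.shape)  # (6, 2)
```

The first extracted weight vector concentrates on the predictors that actually drive the outcome, which is why PLS reduces dimension while retaining predictive directions.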
About the journal:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.