{"title":"基于复合最小化的大协方差矩阵估计","authors":"M. Farné","doi":"10.6092/UNIBO/AMSDOTTORATO/7250","DOIUrl":null,"url":null,"abstract":"A method to regularize large-dimensional covariance matrices under the assumption of approximate factor model will be presented. Existing methods perform estimation by recovering principal components and sparsifying the residual covariance matrix. In our setting this task is achieved recovering the low rank plus sparse decomposition by least squares minimization under nuclear norm plus $l_1$ norm penalization. In the literature, the best known algorithm to solve this problem is soft thresholding plus singular value thresholding and consistency of estimators is derived under specific assumptions on the eigenvalues of the low rank component matrix. In this paper consistency of the proposed estimator will be derived under the pervasive condition, providing the identification of low rank and sparse spaces by introducing the unshrinking of estimated eigenvalues. Algorithm derivation and convergence analysis are provided, and the new procedure is compared with the existing ones under the same assumptions. The performance of our minimizer is described in a wide simulation study, where various low rank plus sparse settings are simulated according to different parameter values.","PeriodicalId":330529,"journal":{"name":"International Federation of Classification Societies","volume":"52 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Large Covariance Matrix Estimation by Composite Minimization\",\"authors\":\"M. Farné\",\"doi\":\"10.6092/UNIBO/AMSDOTTORATO/7250\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A method to regularize large-dimensional covariance matrices under the assumption of approximate factor model will be presented. Existing methods perform estimation by recovering principal components and sparsifying the residual covariance matrix. In our setting this task is achieved recovering the low rank plus sparse decomposition by least squares minimization under nuclear norm plus $l_1$ norm penalization. In the literature, the best known algorithm to solve this problem is soft thresholding plus singular value thresholding and consistency of estimators is derived under specific assumptions on the eigenvalues of the low rank component matrix. In this paper consistency of the proposed estimator will be derived under the pervasive condition, providing the identification of low rank and sparse spaces by introducing the unshrinking of estimated eigenvalues. Algorithm derivation and convergence analysis are provided, and the new procedure is compared with the existing ones under the same assumptions. 
The performance of our minimizer is described in a wide simulation study, where various low rank plus sparse settings are simulated according to different parameter values.\",\"PeriodicalId\":330529,\"journal\":{\"name\":\"International Federation of Classification Societies\",\"volume\":\"52 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-04-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Federation of Classification Societies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.6092/UNIBO/AMSDOTTORATO/7250\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Federation of Classification Societies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.6092/UNIBO/AMSDOTTORATO/7250","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Large Covariance Matrix Estimation by Composite Minimization
M. Farné. International Federation of Classification Societies, 2015-04-30. DOI: 10.6092/UNIBO/AMSDOTTORATO/7250
A method to regularize large-dimensional covariance matrices under the assumption of an approximate factor model is presented. Existing methods perform estimation by recovering principal components and sparsifying the residual covariance matrix. In our setting this task is achieved by recovering the low rank plus sparse decomposition via least squares minimization under a nuclear norm plus $l_1$ norm penalty. In the literature, the best known algorithm for this problem combines soft thresholding with singular value thresholding, and consistency of the resulting estimators is derived under specific assumptions on the eigenvalues of the low rank component. In this paper, consistency of the proposed estimator is derived under the pervasiveness condition; identification of the low rank and sparse spaces is obtained by introducing an unshrinkage step for the estimated eigenvalues. Algorithm derivation and convergence analysis are provided, and the new procedure is compared with the existing ones under the same assumptions. The performance of the proposed minimizer is assessed in a wide simulation study, where various low rank plus sparse settings are generated under different parameter values.
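For concreteness, the composite objective described above can be written as $\min_{L,S} \tfrac{1}{2}\|\hat{\Sigma}_n - (L+S)\|_F^2 + \psi\|L\|_* + \rho\|S\|_1$, where $\hat{\Sigma}_n$ is the sample covariance matrix, $\|\cdot\|_*$ the nuclear norm and $\|\cdot\|_1$ the entrywise $l_1$ norm. The sketch below is a minimal illustration of the soft thresholding plus singular value thresholding scheme mentioned in the abstract, assuming alternating exact proximal updates; the function names, stopping rule and threshold values psi and rho are placeholders chosen for the example, and the eigenvalue unshrinkage step proposed in the paper is not included.

```python
import numpy as np

def soft_threshold(A, rho):
    """Entrywise soft thresholding: proximal operator of rho * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - rho, 0.0)

def singular_value_threshold(A, psi):
    """Singular value thresholding: proximal operator of psi * ||.||_*."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U * np.maximum(s - psi, 0.0)) @ Vt

def low_rank_plus_sparse(sigma_hat, psi, rho, n_iter=200, tol=1e-6):
    """Alternating proximal updates for
    0.5 * ||sigma_hat - (L + S)||_F^2 + psi * ||L||_* + rho * ||S||_1."""
    p = sigma_hat.shape[0]
    L, S = np.zeros((p, p)), np.zeros((p, p))
    for _ in range(n_iter):
        # With S fixed, the exact minimizer in L is a singular value thresholding step.
        L_new = singular_value_threshold(sigma_hat - S, psi)
        # With L fixed, the exact minimizer in S is an entrywise soft thresholding step.
        S_new = soft_threshold(sigma_hat - L_new, rho)
        converged = max(np.abs(L_new - L).max(), np.abs(S_new - S).max()) < tol
        L, S = L_new, S_new
        if converged:
            break
    return L, S
```

As a usage example, one could simulate a covariance as a low rank component plus a sparse component, form the sample covariance from Gaussian draws, and call `low_rank_plus_sparse` on it with thresholds chosen by cross-validation; the returned pair (L, S) then estimates the factor-driven and residual parts of the covariance, respectively.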