Rasoul Lotfi, Davood Shahsavani, Mohammad Arashi
{"title":"使用Ledoit-Wolf收缩方法对椭圆分布的观测进行分类","authors":"Rasoul Lotfi , Davood Shahsavani , Mohammad Arashi","doi":"10.1016/j.jmva.2025.105495","DOIUrl":null,"url":null,"abstract":"<div><div>Classifying observations by the method of linear discriminant analysis deals with two challenges. First, the observations may not follow a Gaussian distribution, Second, the covariance matrix is singular when the number of predictor variables exceeds the number of observations. In this article, we study the classification of high-dimensional elliptically distributed data in the framework of Bayesian approach, while using the Ledoit and Wolf’s shrinkage methodology to overcome the singularity of the covariance matrix. Also, a special case t-distribution is considered and the optimal shrinkage parameter is obtained. Furthermore, we evaluated the performance of the proposed estimators on synthetic and real data. Although the optimal shrinkage parameter does not necessarily provide the minimum test error rate, it can provide a solution to show the superiority of our proposed estimation versus some benchmark method.</div></div>","PeriodicalId":16431,"journal":{"name":"Journal of Multivariate Analysis","volume":"210 ","pages":"Article 105495"},"PeriodicalIF":1.4000,"publicationDate":"2025-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Classifying elliptically distributed observations using the Ledoit–Wolf shrinkage approach\",\"authors\":\"Rasoul Lotfi , Davood Shahsavani , Mohammad Arashi\",\"doi\":\"10.1016/j.jmva.2025.105495\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Classifying observations by the method of linear discriminant analysis deals with two challenges. First, the observations may not follow a Gaussian distribution, Second, the covariance matrix is singular when the number of predictor variables exceeds the number of observations. In this article, we study the classification of high-dimensional elliptically distributed data in the framework of Bayesian approach, while using the Ledoit and Wolf’s shrinkage methodology to overcome the singularity of the covariance matrix. Also, a special case t-distribution is considered and the optimal shrinkage parameter is obtained. Furthermore, we evaluated the performance of the proposed estimators on synthetic and real data. 
Although the optimal shrinkage parameter does not necessarily provide the minimum test error rate, it can provide a solution to show the superiority of our proposed estimation versus some benchmark method.</div></div>\",\"PeriodicalId\":16431,\"journal\":{\"name\":\"Journal of Multivariate Analysis\",\"volume\":\"210 \",\"pages\":\"Article 105495\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2025-08-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Multivariate Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0047259X25000909\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Multivariate Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0047259X25000909","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Classifying elliptically distributed observations using the Ledoit–Wolf shrinkage approach
Classifying observations by linear discriminant analysis faces two challenges. First, the observations may not follow a Gaussian distribution. Second, the covariance matrix is singular when the number of predictor variables exceeds the number of observations. In this article, we study the classification of high-dimensional elliptically distributed data within a Bayesian framework, using the Ledoit–Wolf shrinkage methodology to overcome the singularity of the covariance matrix. The t-distribution is considered as a special case, and the optimal shrinkage parameter is obtained. Furthermore, we evaluate the performance of the proposed estimators on synthetic and real data. Although the optimal shrinkage parameter does not necessarily yield the minimum test error rate, it demonstrates the superiority of our proposed estimator over some benchmark methods.
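The sketch below is not the paper's Bayesian classifier for elliptical distributions; it is a minimal illustration, assuming the standard Gaussian LDA setting, of the covariance-singularity problem the abstract describes and of how the Ledoit–Wolf shrinkage estimator (a convex combination of the sample covariance and a scaled identity matrix) keeps the discriminant rule well-defined when the number of predictors exceeds the number of observations. It relies on scikit-learn's LedoitWolf estimator and LinearDiscriminantAnalysis with shrinkage="auto"; the synthetic data dimensions are illustrative choices, not from the article.

    # Minimal sketch: Ledoit-Wolf shrinkage in a Gaussian LDA, p > n setting.
    # (Assumption: standard scikit-learn tools, not the authors' Bayesian method.)
    from sklearn.covariance import LedoitWolf
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    # High-dimensional setting: more predictors (p = 200) than observations (n = 100).
    X, y = make_classification(n_samples=100, n_features=200, n_informative=20,
                               n_classes=2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                        random_state=0)

    # Ledoit-Wolf estimate: (1 - alpha) * S + alpha * mu * I, where S is the sample
    # covariance and the shrinkage intensity alpha is chosen analytically.
    lw = LedoitWolf().fit(X_train)
    print("estimated shrinkage intensity:", lw.shrinkage_)

    # shrinkage="auto" plugs the Ledoit-Wolf covariance into the discriminant rule;
    # the plain sample covariance would be singular here and LDA would break down.
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    clf.fit(X_train, y_train)
    print("test error rate:", 1 - clf.score(X_test, y_test))

As in the article's setup, the shrinkage intensity that minimizes the estimator's risk need not coincide with the intensity that minimizes the test error rate, which is why both quantities are reported separately above.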
Journal introduction:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.