A Novel Feature Extraction Method and Its Relationships with PCA and KPCA
Deihui Wu
2008 Chinese Conference on Pattern Recognition, 31 October 2008. DOI: 10.1109/CCPR.2008.19
A new feature extraction method for high-dimensional data based on least squares support vector regression (LSSVR) is presented. First, by a special extension of the features of the training samples, the expressions for the optimal projection vectors are derived in the same form as those of the LSSVR algorithm, so the optimal projection vectors can be obtained by solving an LSSVR problem. Then, using the kernel trick, the data are mapped from the original input space to a high-dimensional feature space, extending the linear method to nonlinear feature extraction. Finally, it is proved that 1) the presented method yields the same result as principal component analysis (PCA); 2) compared with PCA, the method is better suited to high-dimensional input spaces; and 3) its nonlinear version is equivalent to kernel principal component analysis (KPCA).
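The paper's LSSVR formulation is not reproduced in this record, but the PCA/KPCA equivalence it proves can be checked numerically: kernel PCA with a linear kernel, computed from the eigenvectors of the double-centered Gram matrix, recovers the same projections as ordinary PCA on the covariance matrix (up to the sign of each component). The sketch below illustrates only that textbook relationship, not the authors' LSSVR derivation; the data and component count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))  # 50 samples, 5 features

# --- Linear PCA: project centered data onto top-2 covariance eigenvectors ---
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / len(X)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1][:2]
pca_proj = Xc @ eigvec[:, order]               # shape (50, 2)

# --- KPCA with a linear kernel: eigendecompose the centered Gram matrix ---
K = X @ X.T                                    # linear kernel Gram matrix
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                                 # double-centering in feature space
kval, kvec = np.linalg.eigh(Kc)
korder = np.argsort(kval)[::-1][:2]
# Scale eigenvectors so KPCA scores match the scale of PCA scores
alphas = kvec[:, korder] / np.sqrt(kval[korder])
kpca_proj = Kc @ alphas                        # shape (50, 2)

# The two sets of projections agree component-wise up to sign
for j in range(2):
    s = np.sign(pca_proj[0, j] * kpca_proj[0, j])
    assert np.allclose(pca_proj[:, j], s * kpca_proj[:, j], atol=1e-8)
print("linear-kernel KPCA matches PCA (up to sign)")
```

The sign ambiguity is inherent: eigenvectors are defined only up to scale, so each component may come out flipped. With a nonlinear kernel (e.g. RBF) in place of `X @ X.T`, the same Gram-matrix computation performs the nonlinear feature extraction the abstract attributes to the kernelized method.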