{"title":"增量数据的线性子空间学习算法","authors":"Bin Fang, Jing Chen, Yuanyan Tang","doi":"10.1109/ICWAPR.2009.5207464","DOIUrl":null,"url":null,"abstract":"Incremental learning has attracted increasing attention in the past decade. Since many real tasks are high-dimensional problems, dimensionality reduction is the important step. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two of the most widely used dimensionality reduction algorithms. However, PCA is an unsupervised algorithm. It is known that PCA is not suitable for classification tasks. Generally, LDA outperforms PCA when classification problem is involved. However, the major shortcoming of LDA is that the performance of LDA is degraded when encountering singularity problem. Recently, the modified LDA, Maximum margin criterion (MMC) was proposed to overcome the shortcomings of PCA and LDA. Nevertheless, MMC is not suitable for incremental data. The paper proposes an incremental extension version of MMC, called Incremental Maximum margin criterion (IMMC) to update projection matrix when new observation is coming, without repetitive learning. Since the approximation intermediate eigenvalue decomposition is introduced, it is low in computational complexity.","PeriodicalId":424264,"journal":{"name":"2009 International Conference on Wavelet Analysis and Pattern Recognition","volume":"81 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A linear subspace learning algorithm for incremental data\",\"authors\":\"Bin Fang, Jing Chen, Yuanyan Tang\",\"doi\":\"10.1109/ICWAPR.2009.5207464\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Incremental learning has attracted increasing attention in the past decade. Since many real tasks are high-dimensional problems, dimensionality reduction is the important step. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two of the most widely used dimensionality reduction algorithms. However, PCA is an unsupervised algorithm. It is known that PCA is not suitable for classification tasks. Generally, LDA outperforms PCA when classification problem is involved. However, the major shortcoming of LDA is that the performance of LDA is degraded when encountering singularity problem. Recently, the modified LDA, Maximum margin criterion (MMC) was proposed to overcome the shortcomings of PCA and LDA. Nevertheless, MMC is not suitable for incremental data. The paper proposes an incremental extension version of MMC, called Incremental Maximum margin criterion (IMMC) to update projection matrix when new observation is coming, without repetitive learning. 
Since the approximation intermediate eigenvalue decomposition is introduced, it is low in computational complexity.\",\"PeriodicalId\":424264,\"journal\":{\"name\":\"2009 International Conference on Wavelet Analysis and Pattern Recognition\",\"volume\":\"81 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-07-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2009 International Conference on Wavelet Analysis and Pattern Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICWAPR.2009.5207464\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 International Conference on Wavelet Analysis and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICWAPR.2009.5207464","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Incremental learning has attracted increasing attention in the past decade. Since many real-world tasks are high-dimensional, dimensionality reduction is an important preprocessing step. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two of the most widely used dimensionality reduction algorithms. However, PCA is an unsupervised algorithm and is known to be poorly suited to classification tasks; LDA generally outperforms PCA on classification problems. The major shortcoming of LDA is that its performance degrades when it encounters the singularity problem, i.e., when the within-class scatter matrix cannot be inverted. Recently, a modification of LDA, the Maximum Margin Criterion (MMC), was proposed to overcome the shortcomings of both PCA and LDA. Nevertheless, MMC is not suited to incremental data. This paper proposes an incremental extension of MMC, called the Incremental Maximum Margin Criterion (IMMC), which updates the projection matrix as each new observation arrives, without repeating the learning from scratch. Because an approximate intermediate eigenvalue decomposition is introduced, the algorithm has low computational complexity.
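To make the criterion concrete: in its standard formulation from the MMC literature (stated here for reference, not quoted from this paper), MMC seeks a column-orthonormal projection W that maximizes the trace of the projected difference between the between-class scatter S_b and the within-class scatter S_w, which avoids inverting S_w and hence sidesteps the singularity problem of LDA:

```latex
J(W) = \operatorname{tr}\!\bigl(W^{\top}(S_b - S_w)\,W\bigr),
\qquad
S_b = \sum_{c=1}^{C} p_c\,(m_c - m)(m_c - m)^{\top},
\qquad
S_w = \sum_{c=1}^{C} p_c\,\Sigma_c
```

where p_c, m_c, and \Sigma_c are the prior, mean, and covariance of class c, m is the global mean, and the optimal W is formed from the leading eigenvectors of S_b - S_w.

A minimal sketch of the incremental idea in Python with NumPy follows. It is an illustration under stated assumptions, not the paper's algorithm: it maintains running class statistics, folds in each observation as it arrives, and extracts the leading eigenvectors of S_b - S_w with a full eigendecomposition, whereas the paper's IMMC uses an approximate intermediate eigenvalue decomposition precisely to avoid that full recomputation. The class name IncrementalMMC and its methods are hypothetical.

```python
import numpy as np

class IncrementalMMC:
    """Illustrative incremental maximum-margin-criterion learner.

    Keeps running per-class counts/means and a running second-moment
    matrix, then extracts the top-k eigenvectors of S_b - S_w. Uses a
    plain full eigendecomposition, not the paper's approximate
    intermediate eigenvalue decomposition.
    """

    def __init__(self, dim, k):
        self.k = k                           # target subspace dimension
        self.n = 0                           # total samples seen
        self.mean = np.zeros(dim)            # global mean
        self.moment = np.zeros((dim, dim))   # running sum of x x^T
        self.counts = {}                     # per-class sample counts
        self.means = {}                      # per-class means

    def partial_fit(self, x, y):
        """Fold in one new observation x with class label y."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.moment += np.outer(x, x)
        c_n = self.counts.get(y, 0) + 1
        c_mean = self.means.get(y, np.zeros_like(x))
        self.counts[y] = c_n
        self.means[y] = c_mean + (x - c_mean) / c_n

    def projection(self):
        """Leading eigenvectors of S_b - S_w as the projection matrix."""
        # Total scatter: E[x x^T] - m m^T; within = total - between.
        St = self.moment / self.n - np.outer(self.mean, self.mean)
        Sb = sum((n_c / self.n) *
                 np.outer(m_c - self.mean, m_c - self.mean)
                 for n_c, m_c in zip(self.counts.values(),
                                     self.means.values()))
        Sw = St - Sb
        vals, vecs = np.linalg.eigh(Sb - Sw)
        return vecs[:, np.argsort(vals)[::-1][:self.k]]
```

For instance, after calling partial_fit for each streamed (x, y) pair, projection() returns a dim-by-k matrix W, and a new sample x is reduced to W.T @ x; re-deriving W after each observation touches only the running statistics rather than the full training history, which is the essence of the incremental setting the paper targets.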