{"title":"Selecting Kernel Eigenfaces for Face Recognition with One Training Sample Per Subject","authors":"Jie Wang, K. Plataniotis, A. Venetsanopoulos","doi":"10.1109/ICME.2006.262861","DOIUrl":null,"url":null,"abstract":"It is well-known that supervised learning techniques such as linear discriminant analysis (LDA) often suffer from the so called small sample size problem when apply to solve face recognition problems. This is due to the fact that in most cases, the number of training samples is much smaller than the dimensionality of the sample space. The problem becomes even more severe if only one training sample is available for each subject. In this paper, followed by the well-known unsupervised technique, kernel principal component analysis (KPCA), a novel feature selection scheme is proposed to establish a discriminant feature subspace in which the class separability is maximized. Extensive experiments performed on the FERET database indicate that the proposed scheme significantly boosts the recognition performance of the traditional KPCA solution","PeriodicalId":339258,"journal":{"name":"2006 IEEE International Conference on Multimedia and Expo","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE International Conference on Multimedia and Expo","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICME.2006.262861","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
It is well known that supervised learning techniques such as linear discriminant analysis (LDA) often suffer from the so-called small sample size problem when applied to face recognition. This is because, in most cases, the number of training samples is much smaller than the dimensionality of the sample space. The problem becomes even more severe when only one training sample is available for each subject. In this paper, building on the well-known unsupervised technique of kernel principal component analysis (KPCA), a novel feature selection scheme is proposed to establish a discriminant feature subspace in which class separability is maximized. Extensive experiments performed on the FERET database indicate that the proposed scheme significantly boosts the recognition performance of the traditional KPCA solution.
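The pipeline the abstract describes, KPCA followed by selecting the kernel eigenfaces that best separate the classes, can be illustrated with a minimal sketch. The separability criterion below (a Fisher-like per-component score), the synthetic data, and all parameter values are illustrative assumptions, not the authors' exact scheme; in the one-sample-per-subject setting such a score would have to be estimated on an auxiliary set with multiple images per subject.

```python
# Hypothetical sketch: KPCA feature extraction, then ranking of the
# resulting kernel eigenfaces by a simple Fisher-like separability score.
# This is NOT the paper's exact selection criterion, only an illustration.
import numpy as np
from sklearn.decomposition import KernelPCA

def fisher_scores(Z, y):
    """Per-feature ratio of between-class to within-class variance."""
    classes = np.unique(y)
    overall_mean = Z.mean(axis=0)
    s_b = np.zeros(Z.shape[1])
    s_w = np.zeros(Z.shape[1])
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        s_b += len(Zc) * (mc - overall_mean) ** 2
        s_w += ((Zc - mc) ** 2).sum(axis=0)
    return s_b / (s_w + 1e-12)  # guard against zero within-class variance

# Stand-in data: 10 subjects with 4 vectorized 32x32 "face images" each,
# playing the role of an auxiliary multi-sample training set.
rng = np.random.default_rng(0)
X_train = rng.random((40, 1024))
y_train = np.repeat(np.arange(10), 4)

# Unsupervised step: project onto kernel eigenfaces via KPCA.
kpca = KernelPCA(n_components=20, kernel="rbf")
Z = kpca.fit_transform(X_train)

# Supervised step: keep the components with the highest separability score.
order = np.argsort(fisher_scores(Z, y_train))[::-1]
selected = order[:10]
Z_sel = Z[:, selected]  # discriminant feature subspace used for recognition
```

A nearest-neighbour classifier in `Z_sel` would then match a probe image against the single gallery image per subject, which is the recognition setting the paper targets.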