{"title":"基于组lasso正则化多核学习的异构特征选择","authors":"Yi-Ren Yeh, Y. Chung, Ting-Chu Lin, Y. Wang","doi":"10.1109/IJCNN.2011.6033554","DOIUrl":null,"url":null,"abstract":"We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the existing MKL algorithm and impose a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and results in a compact set of features for comparable or improved recognition performance. The use of our GL-MKL avoids the problem of choosing the proper technique to normalize the feature attributes collected from heterogeneous domains (and thus with different properties and distribution ranges). Our approach does not need to exhaustively search for the entire feature space when performing feature selection like prior sequential-based feature selection methods did, and we do not require any prior knowledge on the optimal size of the feature subset either. Comparisons with existing MKL or sequential-based feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset for comparable or improved classification performance.","PeriodicalId":415833,"journal":{"name":"The 2011 International Joint Conference on Neural Networks","volume":"52 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Group lasso regularized multiple kernel learning for heterogeneous feature selection\",\"authors\":\"Yi-Ren Yeh, Y. Chung, Ting-Chu Lin, Y. Wang\",\"doi\":\"10.1109/IJCNN.2011.6033554\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the existing MKL algorithm and impose a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and results in a compact set of features for comparable or improved recognition performance. The use of our GL-MKL avoids the problem of choosing the proper technique to normalize the feature attributes collected from heterogeneous domains (and thus with different properties and distribution ranges). Our approach does not need to exhaustively search for the entire feature space when performing feature selection like prior sequential-based feature selection methods did, and we do not require any prior knowledge on the optimal size of the feature subset either. 
Comparisons with existing MKL or sequential-based feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset for comparable or improved classification performance.\",\"PeriodicalId\":415833,\"journal\":{\"name\":\"The 2011 International Joint Conference on Neural Networks\",\"volume\":\"52 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The 2011 International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.2011.6033554\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 2011 International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2011.6033554","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Group lasso regularized multiple kernel learning for heterogeneous feature selection
We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend standard MKL by imposing a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. GL-MKL determines the optimal base kernels, including their associated weights and kernel parameters, and yields a compact set of features with comparable or improved recognition performance. It also sidesteps the problem of choosing a proper technique for normalizing feature attributes collected from heterogeneous domains (and thus with different properties and distribution ranges). Unlike prior sequential feature selection methods, our approach neither exhaustively searches the entire feature space nor requires any prior knowledge of the optimal size of the feature subset. Comparisons with existing MKL and sequential feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset with comparable or improved classification performance.
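The mixed norm at the heart of the group lasso regularizer is easy to state. As a minimal sketch (our own illustrative code, not taken from the paper, whose exact objective may differ), the penalty below takes the ℓ2 norm of the base-kernel weights within each group and sums (ℓ1) across groups, so entire groups, and hence entire features, are driven to zero:

```python
import numpy as np

# Minimal sketch of the mixed l1/l2 (group lasso) penalty on base-kernel
# weights. Illustrative only; `beta` stacks all kernel weights, and
# `groups` lists index ranges, e.g. one group of candidate base kernels
# per feature attribute.
def group_lasso_penalty(beta: np.ndarray, groups: list[range]) -> float:
    # l2 norm within each group, plain sum (l1) across groups: sparsity
    # is induced at the group level, discarding whole features at once.
    return float(sum(np.linalg.norm(beta[g.start:g.stop]) for g in groups))

# Example: 6 kernel weights, 3 features with 2 candidate kernels each.
beta = np.array([0.0, 0.0, 0.7, 0.1, 0.3, 0.0])
groups = [range(0, 2), range(2, 4), range(4, 6)]
print(group_lasso_penalty(beta, groups))  # only the 2nd and 3rd features contribute
```

Because the ℓ2 norm of a group is zero only when every weight in the group is zero, minimizing this penalty deselects whole feature attributes rather than individual kernel weights, which is what makes the regularizer suitable for feature selection.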