{"title":"A leave-one-feature-out wrapper method for feature selection in data classification","authors":"Jianguo Liu, Neil Danait, Shawn Hu, S. Sengupta","doi":"10.1109/BMEI.2013.6747021","DOIUrl":null,"url":null,"abstract":"Feature selection has been an active research area in the past decades. The objective of feature selection includes improving prediction accuracy, accelerating classification speed, and gaining better understanding of the features. Feature selection methods are often divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection. The main goal is to improve prediction accuracy. A distinctive feature of our method is that the number of cross validation trainings is a user controlled constant multiple of the number of features. The strategy can be applied to any classifiers and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed simple method may be particularly attractive to practitioners. Numerical experiments are included to show the simple usage and the effectiveness of the method.","PeriodicalId":163211,"journal":{"name":"2013 6th International Conference on Biomedical Engineering and Informatics","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 6th International Conference on Biomedical Engineering and Informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BMEI.2013.6747021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9
Abstract
Feature selection has been an active research area over the past few decades. Its objectives include improving prediction accuracy, accelerating classification, and gaining a better understanding of the features. Feature selection methods are commonly divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection. The main goal is to improve prediction accuracy. A distinctive feature of our method is that the number of cross-validation trainings is a user-controlled constant multiple of the number of features. The strategy can be applied to any classifier, and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed method may be particularly attractive to practitioners. Numerical experiments illustrate its simple usage and effectiveness.
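
The paper itself provides no code; the following is a minimal sketch of one plausible reading of the leave-one-feature-out idea, assuming scikit-learn. The helper name lofo_select and the keep-a-feature-only-if-dropping-it-hurts rule are illustrative assumptions, not the authors' exact procedure. The cost matches the abstract's claim: with k-fold cross validation, the number of trainings is a constant multiple (here, k) of the number of features.

```python
# A minimal sketch of a leave-one-feature-out (LOFO) wrapper for
# feature selection, assuming scikit-learn. Names and the selection
# rule are illustrative, not taken from the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def lofo_select(X, y, make_clf, cv=5):
    """Return indices of features to keep.

    For each feature j, estimate cross-validation accuracy with
    feature j left out. A feature is kept only if removing it lowers
    accuracy relative to the all-features baseline. Total cost is
    cv * (n_features + 1) trainings, i.e., a user-controlled constant
    multiple of the number of features.
    """
    n_features = X.shape[1]
    baseline = cross_val_score(make_clf(), X, y, cv=cv).mean()
    keep = []
    for j in range(n_features):
        mask = np.ones(n_features, dtype=bool)
        mask[j] = False  # leave feature j out
        score = cross_val_score(make_clf(), X[:, mask], y, cv=cv).mean()
        if score < baseline:  # dropping j hurt accuracy, so keep it
            keep.append(j)
    return keep


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    clf = lambda: make_pipeline(StandardScaler(),
                                LogisticRegression(max_iter=1000))
    selected = lofo_select(X, y, clf)
    print(f"kept {len(selected)} of {X.shape[1]} features:", selected)
```

Because the wrapper only calls a classifier's fit/score interface, any off-the-shelf classifier can be substituted for the logistic regression used here.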