A leave-one-feature-out wrapper method for feature selection in data classification

Jianguo Liu, Neil Danait, Shawn Hu, S. Sengupta
{"title":"一种用于数据分类中特征选择的“留一特征”包装方法","authors":"Jianguo Liu, Neil Danait, Shawn Hu, S. Sengupta","doi":"10.1109/BMEI.2013.6747021","DOIUrl":null,"url":null,"abstract":"Feature selection has been an active research area in the past decades. The objective of feature selection includes improving prediction accuracy, accelerating classification speed, and gaining better understanding of the features. Feature selection methods are often divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection. The main goal is to improve prediction accuracy. A distinctive feature of our method is that the number of cross validation trainings is a user controlled constant multiple of the number of features. The strategy can be applied to any classifiers and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed simple method may be particularly attractive to practitioners. Numerical experiments are included to show the simple usage and the effectiveness of the method.","PeriodicalId":163211,"journal":{"name":"2013 6th International Conference on Biomedical Engineering and Informatics","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"A leave-one-feature-out wrapper method for feature selection in data classification\",\"authors\":\"Jianguo Liu, Neil Danait, Shawn Hu, S. Sengupta\",\"doi\":\"10.1109/BMEI.2013.6747021\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Feature selection has been an active research area in the past decades. The objective of feature selection includes improving prediction accuracy, accelerating classification speed, and gaining better understanding of the features. Feature selection methods are often divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection. The main goal is to improve prediction accuracy. A distinctive feature of our method is that the number of cross validation trainings is a user controlled constant multiple of the number of features. The strategy can be applied to any classifiers and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed simple method may be particularly attractive to practitioners. 
Numerical experiments are included to show the simple usage and the effectiveness of the method.\",\"PeriodicalId\":163211,\"journal\":{\"name\":\"2013 6th International Conference on Biomedical Engineering and Informatics\",\"volume\":\"6 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 6th International Conference on Biomedical Engineering and Informatics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BMEI.2013.6747021\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 6th International Conference on Biomedical Engineering and Informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BMEI.2013.6747021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

Feature selection has been an active research area over the past few decades. Its objectives include improving prediction accuracy, accelerating classification, and gaining a better understanding of the features. Feature selection methods are commonly divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection, with the main goal of improving prediction accuracy. A distinctive property of our method is that the number of cross-validation trainings is a user-controlled constant multiple of the number of features. The strategy can be applied to any classifier, and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed method may be particularly attractive to practitioners. Numerical experiments illustrate the method's simple usage and effectiveness.
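The abstract gives only the cost profile of the method, not the algorithm itself: with k-fold cross-validation and d features, the number of trainings is a constant multiple of d. The sketch below is one plausible realization under that constraint, using the standard leave-one-feature-out scoring rule (drop each feature in turn and measure the change in cross-validated accuracy). The function name `lofo_select`, the tolerance `tol`, and the choice of classifier are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of a leave-one-feature-out (LOFO) wrapper for feature
# selection. This is an assumed reconstruction from the abstract, not the
# paper's exact procedure: the ranking and stopping rules are hypothetical.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def lofo_select(X, y, make_clf, cv=5, tol=0.0):
    """Score each feature by the drop in CV accuracy when it is left out.

    With k = cv folds and d features, this runs k * (d + 1) trainings,
    i.e. a user-controlled constant multiple of the number of features.
    """
    # Baseline: cross-validated accuracy using all features.
    baseline = cross_val_score(make_clf(), X, y, cv=cv).mean()
    keep = []
    for j in range(X.shape[1]):
        # Remove feature j and re-evaluate with the same CV protocol.
        X_minus_j = np.delete(X, j, axis=1)
        score = cross_val_score(make_clf(), X_minus_j, y, cv=cv).mean()
        # Keep the feature only if removing it hurts accuracy by more than tol.
        if baseline - score > tol:
            keep.append(j)
    return keep, baseline


X, y = load_breast_cancer(return_X_y=True)
clf = lambda: make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
selected, base = lofo_select(X, y, clf)
print(f"baseline CV accuracy: {base:.3f}, kept {len(selected)}/{X.shape[1]} features")
```

With cv=5 folds and the 30-feature breast-cancer data, this performs 5 × 31 = 155 trainings, consistent with the abstract's claim that the training count is a user-controlled constant multiple of the number of features. Any off-the-shelf classifier can be substituted for the logistic-regression pipeline.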