A Novel Forward Filter Feature Selection Algorithm Based on Maximum Dual Interaction and Maximum Feature Relevance (MDIMFR) for Machine Learning

M. Anitha, K. Sherly
{"title":"A Novel Forward Filter Feature Selection Algorithm Based on Maximum Dual Interaction and Maximum Feature Relevance(MDIMFR) for Machine Learning","authors":"M. Anitha, K. Sherly","doi":"10.1109/ICACC-202152719.2021.9708300","DOIUrl":null,"url":null,"abstract":"In the last few decades, Feature selection is one of the most challenging and open problem to researchers. The rapid progress in computational techniques causes the generation and recording of data in huge size. Though there exists various feature ranking methods, the processing of data is still a challenging task due to its computational complexity. The filter method has many advantages over the wrapper method. The filter methods are classifier independent and have better computational efficiency. Here, a subset of features is selected based on a certain goal function. Most of these goal functions employs the principle of information theory. Most of the algorithms in earlier studies addressed two factors, that is, maximization of relevancy and minimization of redundancy without considering the interaction among the features. This paper developed a new forward filter feature selection algorithm based on mutual information known as Maximum Dual Interaction and Maximum Feature Relevance(MDIMFR). This method considers all the three factors: relevance, redundancy, and feature interaction. This method is experimented on three datasets and compares the performance with existing methods. The results show that MDIMFR outperforms the existing competitive feature selection methods of recent studies: mRMR, JMIM and CMIM. MDIMFR also achieves good stability in average classification accuracy for a certain number of features, say k and above. Hence, these k features can be considered as an optimal feature set.","PeriodicalId":198810,"journal":{"name":"2021 International Conference on Advances in Computing and Communications (ICACC)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Advances in Computing and Communications (ICACC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICACC-202152719.2021.9708300","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Over the last few decades, feature selection has remained one of the most challenging open problems for researchers. Rapid progress in computational techniques has led to data being generated and recorded on a massive scale. Although various feature ranking methods exist, processing such data is still challenging due to its computational complexity. Filter methods have several advantages over wrapper methods: they are classifier-independent and computationally more efficient. In a filter method, a subset of features is selected according to an objective function, and most such objective functions are based on principles of information theory. Most algorithms in earlier studies addressed only two factors, maximizing relevance and minimizing redundancy, without considering interaction among the features. This paper develops a new mutual-information-based forward filter feature selection algorithm, Maximum Dual Interaction and Maximum Feature Relevance (MDIMFR). The method considers all three factors: relevance, redundancy, and feature interaction. It is evaluated on three datasets, and its performance is compared with existing methods. The results show that MDIMFR outperforms competitive feature selection methods from recent studies: mRMR, JMIM, and CMIM. MDIMFR also achieves good stability in average classification accuracy for a certain number of features, say k, and above. Hence, these k features can be considered an optimal feature set.
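To make the forward filter setting concrete, the sketch below shows a generic greedy selector driven by mutual information. The abstract does not give the exact MDIMFR objective (which additionally scores dual feature interaction), so the scoring rule here is a placeholder mRMR-style trade-off (relevance minus mean redundancy), and the function name `forward_mi_selection` is introduced only for illustration.

```python
# Minimal sketch of a greedy forward filter feature selector based on
# mutual information. The selection criterion is a placeholder
# (relevance minus mean redundancy), NOT the MDIMFR objective itself.
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def forward_mi_selection(X, y, k, random_state=0):
    """Greedily pick k feature indices from X (n_samples x n_features)."""
    n_features = X.shape[1]
    # Relevance term: estimated I(X_i; Y) for every candidate feature.
    relevance = mutual_info_classif(X, y, random_state=random_state)

    selected = [int(np.argmax(relevance))]   # start with the most relevant feature
    candidates = set(range(n_features)) - set(selected)

    while len(selected) < k and candidates:
        best_score, best_feature = -np.inf, None
        for i in candidates:
            # Redundancy term: mean I(X_i; X_s) over already selected features.
            redundancy = np.mean([
                mutual_info_regression(X[:, [s]], X[:, i],
                                       random_state=random_state)[0]
                for s in selected
            ])
            score = relevance[i] - redundancy   # placeholder criterion
            if score > best_score:
                best_score, best_feature = score, i
        selected.append(best_feature)
        candidates.remove(best_feature)
    return selected

# Hypothetical usage on synthetic data:
# from sklearn.datasets import make_classification
# X, y = make_classification(n_samples=300, n_features=20, random_state=0)
# print(forward_mi_selection(X, y, k=5))
```

The same greedy loop accommodates the criteria the paper compares against (mRMR, JMIM, CMIM) by swapping the scoring expression; MDIMFR would additionally incorporate a dual-interaction term between candidate and selected features.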