Computational Cost Reduction in Multi-Objective Feature Selection Using Permutational-Based Differential Evolution

Jesús-Arnulfo Barradas-Palmeros, E. Mezura-Montes, Rafael Rivera-López, H. Acosta-Mesa, Aldo Márquez-Grajales
Mathematical and Computational Applications, published 2024-07-13. DOI: 10.3390/mca29040056

Abstract

Feature selection is a preprocessing step in machine learning that aims to reduce dimensionality and improve performance. Feature selection approaches are commonly classified, according to how candidate feature subsets are evaluated, as filter, wrapper, or embedded methods. Wrapper approaches achieve high performance but carry the disadvantage of high computational cost. Cost-reduction mechanisms that reach competitive performance more efficiently have been proposed in the literature. This work applies two simple and effective resource-saving mechanisms, the fixed and incremental sampling-fraction strategies, together with a memory that avoids repeated evaluations, to multi-objective permutational-based differential evolution for feature selection. The selected multi-objective approach extends the DE-FSPM algorithm with the selection mechanism of the GDE3 algorithm. The results showed high resource savings, especially in computational time and in the number of evaluations required by the search process. Nonetheless, the algorithm's performance was diminished. Therefore, the results reported in the literature on the effectiveness of cost-reduction strategies in single-objective feature selection were only partially sustained in multi-objective feature selection.
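The two cost-saving mechanisms the abstract names can be sketched in isolation: an evaluation memory that skips feature subsets already scored, and an incremental sampling fraction that scores early generations on a small share of the training data and grows that share over the run. The class below is a minimal illustrative sketch, not the authors' implementation; the class name, the linear growth schedule, and the placeholder fitness function are all assumptions.

```python
class CachedSampledEvaluator:
    """Hypothetical sketch of the memory and incremental sampling-fraction
    mechanisms for wrapper feature selection. The real fitness would be a
    classifier evaluated on the sampled training data."""

    def __init__(self, data, initial_fraction=0.25, final_fraction=1.0,
                 generations=50):
        self.data = data                      # full training set (list of samples)
        self.initial_fraction = initial_fraction
        self.final_fraction = final_fraction
        self.generations = generations
        self.memory = {}                      # frozen subset -> cached fitness
        self.evaluations = 0                  # count of actual (non-cached) evaluations

    def sampling_fraction(self, generation):
        # Incremental strategy (assumed linear): grow the fraction of training
        # data used from initial_fraction to final_fraction across generations.
        # A fixed strategy would simply return a constant here.
        t = generation / max(self.generations - 1, 1)
        return self.initial_fraction + t * (self.final_fraction - self.initial_fraction)

    def evaluate(self, subset, generation):
        key = frozenset(subset)
        if key in self.memory:
            # Memory mechanism: a repeated subset costs nothing to re-evaluate.
            return self.memory[key]
        fraction = self.sampling_fraction(generation)
        n = max(1, int(fraction * len(self.data)))
        sample = self.data[:n]                # placeholder for a stratified sample
        fitness = self._dummy_fitness(subset, sample)
        self.memory[key] = fitness
        self.evaluations += 1
        return fitness

    def _dummy_fitness(self, subset, sample):
        # Stand-in for a wrapper evaluation (e.g., classifier accuracy on
        # `sample` minus a penalty on subset size).
        return len(sample) / len(self.data) - 0.01 * len(subset)
```

Under this sketch, re-submitting a subset in any later generation returns the cached score, so the number of costly wrapper evaluations grows with the number of *distinct* subsets visited rather than with population size times generations.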