Multi-objective feature selection algorithm using Beluga Whale Optimization

IF 3.7 · JCR Q2 (Automation & Control Systems) · CAS Tier 2 (Chemistry)
Kiana Kouhpah Esfahani, Behnam Mohammad Hasani Zade, Najme Mansouri
{"title":"基于白鲸优化的多目标特征选择算法","authors":"Kiana Kouhpah Esfahani,&nbsp;Behnam Mohammad Hasani Zade,&nbsp;Najme Mansouri","doi":"10.1016/j.chemolab.2024.105295","DOIUrl":null,"url":null,"abstract":"<div><div>The advancement of science and technology has resulted in large datasets with noisy or redundant features that hamper classification. In feature selection, relevant attributes are selected to reduce dimensionality, thereby improving classification accuracy. Multi-objective optimization is crucial in feature selection because it allows simultaneous evaluation of multiple, often conflicting objectives, such as maximizing model accuracy and minimizing the number of features. Traditional single-objective methods might focus solely on accuracy, often leading to models that are complex and computationally expensive. Multi-objective optimization, on the other hand, considers trade-offs between different criteria, identifying a set of optimal solutions (a Pareto front) where no one solution is clearly superior. It is especially useful when analyzing high-dimensional datasets, as it reduces overfitting and enhances model performance by selecting the most informative subset of features. This article introduces and evaluates the performance of the Binary version of Beluga Whale Optimization and the Multi-Objective Beluga Whale Optimization (MOBWO) algorithm in the context of feature selection. Features are encoded as binary matrices to denote their presence or absence, making it easier to stratify datasets. MOBWO emulates the exploration and exploitation patterns of Beluga Whale Optimization (BWO) through continuous search space. Optimal classification accuracy and minimum feature subset size are two conflicting objectives. The MOBWO was compared using 12 datasets from the University of California Irvine (UCI) repository with eleven well-known optimization algorithms, such as Genetic Algorithm (GA), Sine Cosine Algorithm (SCA), Bat Optimization Algorithm (BOA), Differential Evolution (DE), Whale Optimization Algorithm (WOA), Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), Multi-Objective Grey Wolf Optimizer (MOGWO), Multi-Objective Grasshopper Optimization Algorithm (MOGOA), Multi-Objective Non-dominated advanced Butterfly Optimization Algorithm (MONSBOA), and Multi-Objective Slime Mould Algorithm (MOSMA). In experiments using Random Forest (RF) as the classifier, different performance metrics were evaluated. The computational results show that the proposed BBWO algorithm achieves an average accuracy rate of 99.06 % across 12 datasets. Additionally, the proposed MOBWO algorithm outperforms existing multi-objective feature selection methods on all 12 datasets based on three metrics: Success Counting (SCC), Inverted Generational Distance (IGD), and Hypervolume indicators (HV). 
For instance, MOBWO achieves an average HV that is at least 3.54 % higher than all other methods.</div></div>","PeriodicalId":9774,"journal":{"name":"Chemometrics and Intelligent Laboratory Systems","volume":"257 ","pages":"Article 105295"},"PeriodicalIF":3.7000,"publicationDate":"2024-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-objective feature selection algorithm using Beluga Whale Optimization\",\"authors\":\"Kiana Kouhpah Esfahani,&nbsp;Behnam Mohammad Hasani Zade,&nbsp;Najme Mansouri\",\"doi\":\"10.1016/j.chemolab.2024.105295\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The advancement of science and technology has resulted in large datasets with noisy or redundant features that hamper classification. In feature selection, relevant attributes are selected to reduce dimensionality, thereby improving classification accuracy. Multi-objective optimization is crucial in feature selection because it allows simultaneous evaluation of multiple, often conflicting objectives, such as maximizing model accuracy and minimizing the number of features. Traditional single-objective methods might focus solely on accuracy, often leading to models that are complex and computationally expensive. Multi-objective optimization, on the other hand, considers trade-offs between different criteria, identifying a set of optimal solutions (a Pareto front) where no one solution is clearly superior. It is especially useful when analyzing high-dimensional datasets, as it reduces overfitting and enhances model performance by selecting the most informative subset of features. This article introduces and evaluates the performance of the Binary version of Beluga Whale Optimization and the Multi-Objective Beluga Whale Optimization (MOBWO) algorithm in the context of feature selection. Features are encoded as binary matrices to denote their presence or absence, making it easier to stratify datasets. MOBWO emulates the exploration and exploitation patterns of Beluga Whale Optimization (BWO) through continuous search space. Optimal classification accuracy and minimum feature subset size are two conflicting objectives. The MOBWO was compared using 12 datasets from the University of California Irvine (UCI) repository with eleven well-known optimization algorithms, such as Genetic Algorithm (GA), Sine Cosine Algorithm (SCA), Bat Optimization Algorithm (BOA), Differential Evolution (DE), Whale Optimization Algorithm (WOA), Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), Multi-Objective Grey Wolf Optimizer (MOGWO), Multi-Objective Grasshopper Optimization Algorithm (MOGOA), Multi-Objective Non-dominated advanced Butterfly Optimization Algorithm (MONSBOA), and Multi-Objective Slime Mould Algorithm (MOSMA). In experiments using Random Forest (RF) as the classifier, different performance metrics were evaluated. The computational results show that the proposed BBWO algorithm achieves an average accuracy rate of 99.06 % across 12 datasets. Additionally, the proposed MOBWO algorithm outperforms existing multi-objective feature selection methods on all 12 datasets based on three metrics: Success Counting (SCC), Inverted Generational Distance (IGD), and Hypervolume indicators (HV). 
For instance, MOBWO achieves an average HV that is at least 3.54 % higher than all other methods.</div></div>\",\"PeriodicalId\":9774,\"journal\":{\"name\":\"Chemometrics and Intelligent Laboratory Systems\",\"volume\":\"257 \",\"pages\":\"Article 105295\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2024-12-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chemometrics and Intelligent Laboratory Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0169743924002351\",\"RegionNum\":2,\"RegionCategory\":\"化学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chemometrics and Intelligent Laboratory Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0169743924002351","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

The advancement of science and technology has resulted in large datasets with noisy or redundant features that hamper classification. In feature selection, relevant attributes are selected to reduce dimensionality, thereby improving classification accuracy. Multi-objective optimization is crucial in feature selection because it allows simultaneous evaluation of multiple, often conflicting objectives, such as maximizing model accuracy and minimizing the number of features. Traditional single-objective methods might focus solely on accuracy, often leading to models that are complex and computationally expensive. Multi-objective optimization, on the other hand, considers trade-offs between different criteria, identifying a set of optimal solutions (a Pareto front) where no one solution is clearly superior. It is especially useful when analyzing high-dimensional datasets, as it reduces overfitting and enhances model performance by selecting the most informative subset of features. This article introduces and evaluates the performance of the Binary version of Beluga Whale Optimization and the Multi-Objective Beluga Whale Optimization (MOBWO) algorithm in the context of feature selection. Features are encoded as binary matrices to denote their presence or absence, making it easier to stratify datasets. MOBWO emulates the exploration and exploitation patterns of Beluga Whale Optimization (BWO) through continuous search space. Optimal classification accuracy and minimum feature subset size are two conflicting objectives. The MOBWO was compared using 12 datasets from the University of California Irvine (UCI) repository with eleven well-known optimization algorithms, such as Genetic Algorithm (GA), Sine Cosine Algorithm (SCA), Bat Optimization Algorithm (BOA), Differential Evolution (DE), Whale Optimization Algorithm (WOA), Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), Multi-Objective Grey Wolf Optimizer (MOGWO), Multi-Objective Grasshopper Optimization Algorithm (MOGOA), Multi-Objective Non-dominated advanced Butterfly Optimization Algorithm (MONSBOA), and Multi-Objective Slime Mould Algorithm (MOSMA). In experiments using Random Forest (RF) as the classifier, different performance metrics were evaluated. The computational results show that the proposed BBWO algorithm achieves an average accuracy rate of 99.06 % across 12 datasets. Additionally, the proposed MOBWO algorithm outperforms existing multi-objective feature selection methods on all 12 datasets based on three metrics: Success Counting (SCC), Inverted Generational Distance (IGD), and Hypervolume indicators (HV). For instance, MOBWO achieves an average HV that is at least 3.54 % higher than all other methods.
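
To make the two-objective formulation concrete, the following is a minimal sketch (not the authors' implementation) of how a binary feature mask could be scored with a Random Forest wrapper: one objective is the cross-validated classification error, the other is the fraction of features retained, and both are minimized. The dataset, classifier settings, and CV folds (scikit-learn's breast-cancer data, 100 trees, 5-fold CV) are placeholder assumptions.

```python
# Minimal sketch (not the paper's code): score a 0/1 feature mask as two
# conflicting objectives, (1 - accuracy) and the fraction of features kept.
# Dataset, classifier settings, and CV folds are placeholder assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def evaluate(mask: np.ndarray) -> tuple[float, float]:
    """Return (classification error, feature ratio); both objectives are minimized."""
    if mask.sum() == 0:                          # an empty subset is infeasible
        return 1.0, 1.0
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    acc = cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()
    return 1.0 - acc, mask.sum() / mask.size

# Example: evaluate one random candidate from the binary search space
rng = np.random.default_rng(0)
mask = (rng.random(X.shape[1]) > 0.5).astype(int)
print(evaluate(mask))
```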
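The comparison in the abstract also rests on Pareto non-dominance (to build the front) and the hypervolume (HV) indicator. The sketch below illustrates both for a two-objective minimization setting; the dominance filter and the 2-D hypervolume routine are illustrative assumptions, not the metric code used in the paper, and SCC and IGD are omitted.

```python
# Hedged sketch of Pareto filtering and a 2-D hypervolume for minimization.
import numpy as np

def non_dominated(points: np.ndarray) -> np.ndarray:
    """Return the points not dominated by any other point (minimization)."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p) for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(p)
    return np.unique(np.array(keep), axis=0)

def hypervolume_2d(front: np.ndarray, ref: np.ndarray) -> float:
    """Area dominated by a 2-D minimization front, bounded by a reference point."""
    front = front[np.argsort(front[:, 0])]       # sort by the first objective
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        hv += (ref[0] - f1) * (prev_f2 - f2)     # horizontal slice of dominated area
        prev_f2 = f2
    return hv

pts = np.array([[0.10, 0.9], [0.05, 0.6], [0.20, 0.3], [0.15, 0.5]])
front = non_dominated(pts)
print(front, hypervolume_2d(front, ref=np.array([1.0, 1.0])))
```

Under this convention a larger HV means the front covers more of the objective space below the reference point, which is the sense in which the abstract reports MOBWO's average HV as at least 3.54 % higher than the other methods.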
Source journal: Chemometrics and Intelligent Laboratory Systems
CiteScore: 7.50
Self-citation rate: 7.70%
Articles published per year: 169
Review time: 3.4 months

Journal description: Chemometrics and Intelligent Laboratory Systems publishes original research papers, short communications, reviews, tutorials and Original Software Publications reporting on development of novel statistical, mathematical, or computer techniques in Chemistry and related disciplines. Chemometrics is the chemical discipline that uses mathematical and statistical methods to design or select optimal procedures and experiments, and to provide maximum chemical information by analysing chemical data. The journal deals with the following topics:
1) Development of new statistical, mathematical and chemometrical methods for Chemistry and related fields (Environmental Chemistry, Biochemistry, Toxicology, System Biology, -Omics, etc.)
2) Novel applications of chemometrics to all branches of Chemistry and related fields (typical domains of interest are: process data analysis, experimental design, data mining, signal processing, supervised modelling, decision making, robust statistics, mixture analysis, multivariate calibration, etc.). Routine applications of established chemometrical techniques will not be considered.
3) Development of new software that provides novel tools or truly advances the use of chemometrical methods.
4) Well characterized data sets to test performance of the new methods and software.
The journal complies with the International Committee of Medical Journal Editors' Uniform Requirements for Manuscripts.