Kiana Kouhpah Esfahani, Behnam Mohammad Hasani Zade, Najme Mansouri
Title: Multi-objective feature selection algorithm using Beluga Whale Optimization
Journal: Chemometrics and Intelligent Laboratory Systems, vol. 257, Article 105295 (JCR Q2, Automation & Control Systems; IF 3.7)
DOI: 10.1016/j.chemolab.2024.105295
Published: 2024-12-04
URL: https://www.sciencedirect.com/science/article/pii/S0169743924002351
Citations: 0
Abstract
The advancement of science and technology has resulted in large datasets with noisy or redundant features that hamper classification. Feature selection reduces dimensionality by retaining only the relevant attributes, thereby improving classification accuracy. Multi-objective optimization is crucial in feature selection because it evaluates multiple, often conflicting objectives simultaneously, such as maximizing model accuracy while minimizing the number of features. Traditional single-objective methods may focus solely on accuracy, often yielding models that are complex and computationally expensive. Multi-objective optimization, in contrast, considers trade-offs between criteria and identifies a set of optimal solutions (a Pareto front) in which no single solution is clearly superior. It is especially useful for high-dimensional datasets, as selecting the most informative subset of features reduces overfitting and enhances model performance. This article introduces and evaluates the Binary Beluga Whale Optimization (BBWO) and the Multi-Objective Beluga Whale Optimization (MOBWO) algorithms in the context of feature selection. Features are encoded as binary matrices that denote their presence or absence, making the datasets easier to stratify. MOBWO emulates the exploration and exploitation patterns of Beluga Whale Optimization (BWO) in a continuous search space. Maximum classification accuracy and minimum feature subset size are the two conflicting objectives.
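The Pareto-front idea above can be illustrated with a short sketch (not the authors' code): each candidate is scored on two minimized objectives, classification error and feature-subset size, and only non-dominated pairs survive. The objective values below are hypothetical.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b: a is no worse in
    every objective and strictly better in at least one (both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (classification error, number of selected features) pairs,
# one per candidate binary feature mask. Both objectives are minimized.
candidates = [(0.10, 5), (0.08, 7), (0.10, 9), (0.20, 2)]
front = pareto_front(candidates)
# (0.10, 9) drops out: (0.10, 5) has the same error with fewer features.
```

No single member of the resulting front beats the others in both objectives, which is exactly the trade-off set the abstract describes.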
MOBWO was compared with eleven well-known optimization algorithms on 12 datasets from the University of California, Irvine (UCI) repository: Genetic Algorithm (GA), Sine Cosine Algorithm (SCA), Bat Optimization Algorithm (BOA), Differential Evolution (DE), Whale Optimization Algorithm (WOA), Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multi-Objective Particle Swarm Optimization (MOPSO), Multi-Objective Grey Wolf Optimizer (MOGWO), Multi-Objective Grasshopper Optimization Algorithm (MOGOA), Multi-Objective Non-dominated advanced Butterfly Optimization Algorithm (MONSBOA), and Multi-Objective Slime Mould Algorithm (MOSMA). In experiments using Random Forest (RF) as the classifier, several performance metrics were evaluated. The computational results show that the proposed BBWO algorithm achieves an average accuracy of 99.06 % across the 12 datasets. Additionally, the proposed MOBWO algorithm outperforms the existing multi-objective feature selection methods on all 12 datasets according to three metrics: Success Counting (SCC), Inverted Generational Distance (IGD), and Hypervolume (HV). For instance, MOBWO achieves an average HV at least 3.54 % higher than that of every other method.
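The hypervolume (HV) indicator used in the comparison measures the region of objective space dominated by a front, bounded by a reference point; larger is better. A minimal 2D sketch for minimization problems, illustrative only and not the paper's implementation:

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a non-dominated 2D minimization front: the area
    dominated by the front and bounded by the reference point `ref`,
    which every front point must dominate."""
    hv = 0.0
    prev_f2 = ref[1]
    for f1, f2 in sorted(front):  # ascending f1 implies descending f2
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# Hypothetical front of (error, n_features) trade-offs, reference point (4, 4).
hv = hypervolume_2d([(1, 3), (2, 2), (3, 1)], (4, 4))  # area = 6.0
```

The sweep accumulates one rectangle per front point, so it is exact only when the input is already non-dominated.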
Journal description:
Chemometrics and Intelligent Laboratory Systems publishes original research papers, short communications, reviews, tutorials and Original Software Publications reporting on development of novel statistical, mathematical, or computer techniques in Chemistry and related disciplines.
Chemometrics is the chemical discipline that uses mathematical and statistical methods to design or select optimal procedures and experiments, and to provide maximum chemical information by analysing chemical data.
The journal deals with the following topics:
1) Development of new statistical, mathematical and chemometrical methods for Chemistry and related fields (Environmental Chemistry, Biochemistry, Toxicology, System Biology, -Omics, etc.)
2) Novel applications of chemometrics to all branches of Chemistry and related fields (typical domains of interest are: process data analysis, experimental design, data mining, signal processing, supervised modelling, decision making, robust statistics, mixture analysis, multivariate calibration etc.) Routine applications of established chemometrical techniques will not be considered.
3) Development of new software that provides novel tools or truly advances the use of chemometrical methods.
4) Well characterized data sets to test performance for the new methods and software.
The journal complies with the International Committee of Medical Journal Editors' uniform requirements for manuscripts.