Feature Selection Based on Improved White Shark Optimizer

Impact Factor 4.9 · CAS Tier 3 (Computer Science) · JCR Q1 (Engineering, Multidisciplinary)
Qianqian Cui, Shijie Zhao, Miao Chen, Qiuli Zhao
{"title":"基于改进的白鲨优化器的特征选择","authors":"Qianqian Cui,&nbsp;Shijie Zhao,&nbsp;Miao Chen,&nbsp;Qiuli Zhao","doi":"10.1007/s42235-024-00580-w","DOIUrl":null,"url":null,"abstract":"<div><p>Feature Selection (FS) is an optimization problem that aims to downscale and improve the quality of a dataset by retaining relevant features while excluding redundant ones. It enhances the classification accuracy of a dataset and holds a crucial position in the field of data mining. Utilizing metaheuristic algorithms for selecting feature subsets contributes to optimizing the FS problem. The White Shark Optimizer (WSO), as a metaheuristic algorithm, primarily simulates the behavior of great white sharks’ sense of hearing and smelling during swimming and hunting. However, it fails to consider their other randomly occurring behaviors, for example, Tail Slapping and Clustered Together behaviors. The Tail Slapping behavior can increase population diversity and improve the global search performance of the algorithm. The Clustered Together behavior includes access to food and mating, which can change the direction of local search and enhance local utilization. It incorporates Tail Slapping and Clustered Together behavior into the original algorithm to propose an Improved White Shark Optimizer (IWSO). The two behaviors and the presented IWSO are tested separately using the CEC2017 benchmark functions, and the test results of IWSO are compared with other metaheuristic algorithms, which proves that IWSO combining the two behaviors has a stronger search capability. Feature selection can be mathematically described as a weighted combination of feature subset size and classification error rate as an optimization model, which is iteratively optimized using discretized IWSO which combines with K-Nearest Neighbor (KNN) on 16 benchmark datasets and the results are compared with 7 metaheuristics. Experimental results show that the IWSO is more capable in selecting feature subsets and improving classification accuracy.</p></div>","PeriodicalId":614,"journal":{"name":"Journal of Bionic Engineering","volume":"21 6","pages":"3123 - 3150"},"PeriodicalIF":4.9000,"publicationDate":"2024-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Feature Selection Based on Improved White Shark Optimizer\",\"authors\":\"Qianqian Cui,&nbsp;Shijie Zhao,&nbsp;Miao Chen,&nbsp;Qiuli Zhao\",\"doi\":\"10.1007/s42235-024-00580-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Feature Selection (FS) is an optimization problem that aims to downscale and improve the quality of a dataset by retaining relevant features while excluding redundant ones. It enhances the classification accuracy of a dataset and holds a crucial position in the field of data mining. Utilizing metaheuristic algorithms for selecting feature subsets contributes to optimizing the FS problem. The White Shark Optimizer (WSO), as a metaheuristic algorithm, primarily simulates the behavior of great white sharks’ sense of hearing and smelling during swimming and hunting. However, it fails to consider their other randomly occurring behaviors, for example, Tail Slapping and Clustered Together behaviors. The Tail Slapping behavior can increase population diversity and improve the global search performance of the algorithm. The Clustered Together behavior includes access to food and mating, which can change the direction of local search and enhance local utilization. 
It incorporates Tail Slapping and Clustered Together behavior into the original algorithm to propose an Improved White Shark Optimizer (IWSO). The two behaviors and the presented IWSO are tested separately using the CEC2017 benchmark functions, and the test results of IWSO are compared with other metaheuristic algorithms, which proves that IWSO combining the two behaviors has a stronger search capability. Feature selection can be mathematically described as a weighted combination of feature subset size and classification error rate as an optimization model, which is iteratively optimized using discretized IWSO which combines with K-Nearest Neighbor (KNN) on 16 benchmark datasets and the results are compared with 7 metaheuristics. Experimental results show that the IWSO is more capable in selecting feature subsets and improving classification accuracy.</p></div>\",\"PeriodicalId\":614,\"journal\":{\"name\":\"Journal of Bionic Engineering\",\"volume\":\"21 6\",\"pages\":\"3123 - 3150\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2024-09-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Bionic Engineering\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s42235-024-00580-w\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Bionic Engineering","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s42235-024-00580-w","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract



Feature Selection (FS) is an optimization problem that aims to reduce the dimensionality and improve the quality of a dataset by retaining relevant features while discarding redundant ones. It enhances classification accuracy and plays a crucial role in data mining. Metaheuristic algorithms are well suited to searching for good feature subsets in this setting. The White Shark Optimizer (WSO) is a metaheuristic that primarily simulates the hearing- and smell-guided behavior of great white sharks while swimming and hunting. However, it neglects other behaviors that the sharks exhibit at random, such as the Tail Slapping and Clustered Together behaviors. Tail Slapping increases population diversity and improves the algorithm's global search performance, while the Clustered Together behavior, which covers access to food and mating, redirects the local search and strengthens local exploitation. This work incorporates both behaviors into the original algorithm to propose an Improved White Shark Optimizer (IWSO). The two behaviors and the resulting IWSO are evaluated on the CEC2017 benchmark functions, and IWSO is compared with other metaheuristic algorithms; the results show that combining the two behaviors yields a stronger search capability. Feature selection is then formulated as an optimization model whose objective is a weighted combination of feature-subset size and classification error rate. This model is iteratively optimized by a discretized IWSO coupled with a K-Nearest Neighbor (KNN) classifier on 16 benchmark datasets, and the results are compared with 7 metaheuristics. The experiments show that IWSO selects better feature subsets and achieves higher classification accuracy.
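To make the wrapper formulation concrete, the sketch below shows the kind of objective the abstract describes: a weighted combination of the KNN classification error rate and the relative size of the selected feature subset, evaluated on a binary feature mask obtained by discretizing a continuous search-agent position. The weight alpha, the sigmoid transfer function used for discretization, the 5-fold cross-validation, and the KNN settings are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a wrapper-style FS objective (assumed details, see lead-in).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def binarize_position(position, rng):
    """Map a continuous search-agent position to a 0/1 feature mask via a
    sigmoid transfer function (an assumed discretization scheme)."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)


def fs_fitness(mask, X, y, alpha=0.99, k=5):
    """Fitness = alpha * error_rate + (1 - alpha) * (#selected / #total).
    Lower is better; an empty subset gets the worst possible value."""
    if mask.sum() == 0:
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=k)
    acc = cross_val_score(knn, X[:, mask.astype(bool)], y, cv=5).mean()
    error_rate = 1.0 - acc
    return alpha * error_rate + (1.0 - alpha) * mask.sum() / X.shape[1]


if __name__ == "__main__":
    # Evaluate one candidate position on a toy dataset.
    from sklearn.datasets import load_breast_cancer

    X, y = load_breast_cancer(return_X_y=True)
    rng = np.random.default_rng(0)
    position = rng.normal(size=X.shape[1])   # a continuous search-agent position
    mask = binarize_position(position, rng)
    print("selected features:", int(mask.sum()), "fitness:", fs_fitness(mask, X, y))
```

In a full implementation, each white shark's position would be updated by the IWSO search operators (including the added Tail Slapping and Clustered Together moves), binarized, and scored with this fitness at every iteration; the mask with the lowest fitness after the final iteration is the selected feature subset.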

Source journal
Journal of Bionic Engineering (Engineering & Technology – Materials Science: Biomaterials)
CiteScore: 7.10
Self-citation rate: 10.00%
Articles per year: 162
Review time: 10.0 months
Journal description: The Journal of Bionic Engineering (JBE) is a peer-reviewed journal that publishes original research papers and reviews applying knowledge learned from nature and biological systems to solve concrete engineering problems. Topics covered by JBE include, but are not limited to:
- Mechanisms, kinematics and control of animal locomotion; development of mobile robots with walking (running and crawling), swimming or flying abilities inspired by animal locomotion.
- Structures, morphologies, composition and physical properties of natural and biomaterials; fabrication of new materials mimicking the properties and functions of natural and biomaterials.
- Biomedical materials, artificial organs and tissue engineering for medical applications; rehabilitation equipment and devices.
- Development of bioinspired computation methods and artificial intelligence for engineering applications.