A novel hybrid Aquila optimizer with Whale optimization algorithm for global optimization, feature selection and optimizing SVM parameters

Rabab Bousmaha
{"title":"A novel hybrid Aquila optimizer with Whale optimziation algorithm for global optimization, feature selection and optimizing SVM parameters","authors":"Rabab Bousmaha","doi":"10.1109/PAIS56586.2022.9946891","DOIUrl":null,"url":null,"abstract":"“Support vector machine (SVM)” is one of the well-known machine learning algorithms used for classification and regression tasks. The “SVM” parameters and features selection have a significant impact on the “SVM” model's complexity and classification accuracy. Finding the best set of features and selecting the appropriate parameters of “SVM” is considered an optimization problem. Different metaheuristic algorithms are employed in the literature to choose the best set of features and to optimize “SVM” parameters simultaneously. This paper presents a new algorithm hybrid based on “Aquila Optimizer” and “Whale Optimization Algorithm” with Adaptive inertial weight, called AOWOA. The AOWOA is used for global optimization and feature selection and optimizing “SVM” parameters to achieve a higher classification accuracy. We utilized two experiment series to test the presented (AOWOA) algorithm. In the first experiment, 13 standard “benchmark functions” are used and the AOWOA algorithm is compared to AO, MVO, WOA, MFO, PSO, and SSA. In the second experiment, the results of AOWOA-SVM are compared with four metaheuristic algorithms: MVO, GWO, WOA, and BAT using ten labeled datasets. The experiment results proved that AOWOA can reduce the number of features, find the optimal parameters of “SVM”, and avoids local optima with high classification accuracy in most datasets compared to other algorithms.","PeriodicalId":266229,"journal":{"name":"2022 4th International Conference on Pattern Analysis and Intelligent Systems (PAIS)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 4th International Conference on Pattern Analysis and Intelligent Systems (PAIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PAIS56586.2022.9946891","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Support vector machine (SVM) is one of the well-known machine learning algorithms used for classification and regression tasks. SVM parameter tuning and feature selection have a significant impact on the model's complexity and classification accuracy, so finding the best subset of features and the appropriate SVM parameters can be treated as an optimization problem. Various metaheuristic algorithms have been employed in the literature to select the best set of features and optimize SVM parameters simultaneously. This paper presents a new hybrid algorithm based on the Aquila Optimizer (AO) and the Whale Optimization Algorithm (WOA) with an adaptive inertia weight, called AOWOA. AOWOA is used for global optimization, feature selection, and SVM parameter optimization to achieve higher classification accuracy. Two series of experiments were conducted to evaluate the proposed algorithm. In the first, AOWOA is compared with AO, MVO, WOA, MFO, PSO, and SSA on 13 standard benchmark functions. In the second, AOWOA-SVM is compared with four metaheuristic algorithms (MVO, GWO, WOA, and BAT) on ten labeled datasets. The experimental results show that, compared to the other algorithms, AOWOA reduces the number of features, finds optimal SVM parameters, and avoids local optima while achieving high classification accuracy on most datasets.
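The abstract does not reproduce AOWOA's update equations, but the wrapper setup it describes can be illustrated with a short sketch: each candidate solution encodes the SVM hyperparameters (C, gamma) together with a binary feature mask, and its fitness combines cross-validated accuracy with a penalty on the number of selected features. The Python sketch below is a simplified stand-in, not the paper's method: it uses scikit-learn's breast-cancer dataset as a placeholder, a single WOA-style encircling move plus a random exploration step in place of the full AO/WOA operators, and an assumed linear schedule for the adaptive inertia weight w.

```python
# Minimal sketch of metaheuristic-driven SVM tuning + feature selection.
# NOTE: simplified stand-in for AOWOA -- the encircling/exploration moves and
# the inertia-weight schedule below are assumptions, not the paper's equations.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # placeholder dataset
n_feat = X.shape[1]
dim = 2 + n_feat                             # [log2(C), log2(gamma), feature mask]
alpha = 0.99                                 # accuracy vs. feature-count trade-off

def fitness(pos):
    """Cross-validated accuracy plus a reward for using fewer features."""
    C, gamma = 2.0 ** pos[0], 2.0 ** pos[1]
    mask = pos[2:] > 0.5
    if not mask.any():
        return 0.0
    acc = cross_val_score(SVC(C=C, gamma=gamma), X[:, mask], y, cv=3).mean()
    return alpha * acc + (1.0 - alpha) * (1.0 - mask.sum() / n_feat)

# Initialise the population: hyperparameters on a log2 scale, mask values in [0, 1].
pop = np.empty((20, dim))
pop[:, 0] = rng.uniform(-5, 15, 20)
pop[:, 1] = rng.uniform(-15, 3, 20)
pop[:, 2:] = rng.uniform(0, 1, (20, n_feat))
fit = np.array([fitness(p) for p in pop])
best, best_fit = pop[fit.argmax()].copy(), fit.max()

T = 30
for t in range(T):
    w = 0.9 - 0.5 * t / T                    # adaptive inertia weight (assumed schedule)
    a = 2.0 * (1.0 - t / T)                  # WOA-style shrinking coefficient
    for i in range(len(pop)):
        A = 2.0 * a * rng.random(dim) - a
        Cc = 2.0 * rng.random(dim)
        if rng.random() < 0.5:               # exploitation: encircle the best solution
            new = best - A * np.abs(Cc * best - pop[i])
        else:                                # exploration: move toward a random peer
            peer = pop[rng.integers(len(pop))]
            new = pop[i] + rng.random(dim) * (peer - pop[i])
        pop[i] = w * pop[i] + (1.0 - w) * new
        pop[i, 0] = np.clip(pop[i, 0], -5, 15)
        pop[i, 1] = np.clip(pop[i, 1], -15, 3)
        f = fitness(pop[i])
        if f > best_fit:
            best, best_fit = pop[i].copy(), f

sel = best[2:] > 0.5
print(f"best fitness={best_fit:.4f}, C=2^{best[0]:.2f}, "
      f"gamma=2^{best[1]:.2f}, features={sel.sum()}/{n_feat}")
```

In the paper's setting, the same kind of fitness function would instead be driven by the actual AO and WOA position-update rules, with the adaptive inertia weight balancing exploration and exploitation across iterations.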