An enhanced Walrus Optimizer with opposition-based learning and mutation strategy for data clustering
Laith Abualigah, Saleh Ali Alomari, Mohammad H. Almomani, Raed Abu Zitar, Hazem Migdady, Kashif Saleem, Aseel Smerat, Vaclav Snasel, Absalom E. Ezugwu
Array, Volume 26, Article 100409. Published 2025-05-19. DOI: 10.1016/j.array.2025.100409
Abstract
Data clustering plays a crucial role in various domains, such as image processing, pattern recognition, and data mining. Traditional clustering techniques often suffer from limitations such as sensitivity to initialization, poor convergence, and entrapment in local optima. To address these challenges, this paper proposes an Enhanced Walrus Optimizer (IWO) tailored for clustering tasks. The proposed IWO integrates two powerful strategies, Opposition-Based Learning (OBL) and a Mutation Search Strategy (MSS), to improve population diversity and prevent premature convergence, thereby enhancing both exploration and exploitation. These enhancements enable more accurate and stable identification of cluster centers. The effectiveness of IWO is validated through extensive experiments on multiple benchmark clustering datasets, with comparisons against several state-of-the-art metaheuristic algorithms, including PSO, GWO, and AOA, among others. The results demonstrate that IWO outperforms the competing algorithms, yielding clusters with improved compactness and separation. Statistical validation using p-values and ranking scores further confirms the superiority of the proposed method. These findings suggest that IWO offers a robust and flexible framework for solving complex clustering problems. Future work will explore hybrid deep learning-integrated models and parallel implementations to enhance scalability.
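The abstract describes the method only at a high level, so the Python sketch below is a hedged illustration of how opposition-based learning and a mutation step can be grafted onto a population-based search for cluster centers. The within-cluster-sum-of-squares fitness, the move-toward-best update rule, and all function names and parameter values are assumptions for illustration; they are not the authors' actual IWO equations, which are not given here.

```python
# Minimal illustrative sketch (not the authors' implementation): a generic
# population-based clustering loop that adds opposition-based learning (OBL)
# and a Gaussian mutation step, the two enhancements named in the abstract.
# The position update is a simple "move toward the best" placeholder, since
# the actual Walrus Optimizer equations are not given in this abstract.
import numpy as np

def wcss(centers, data):
    """Within-cluster sum of squares: assumed fitness for a set of centers."""
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return np.sum(np.min(d, axis=1) ** 2)

def enhanced_clustering(data, k, pop_size=30, iters=200, mut_rate=0.1, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = data.min(axis=0), data.max(axis=0)
    dim = data.shape[1]
    # Each individual encodes k candidate cluster centers of dimension dim.
    pop = rng.uniform(lb, ub, size=(pop_size, k, dim))
    fit = np.array([wcss(p, data) for p in pop])

    for _ in range(iters):
        best = pop[np.argmin(fit)]
        for i in range(pop_size):
            # Placeholder update rule: stochastic move toward the current best.
            cand = pop[i] + rng.random() * (best - pop[i])
            # Mutation search step: occasional Gaussian perturbation for diversity.
            if rng.random() < mut_rate:
                cand += rng.normal(scale=0.1, size=(k, dim)) * (ub - lb)
            cand = np.clip(cand, lb, ub)
            # Opposition-based learning: evaluate the mirrored point lb + ub - x
            # and keep whichever of the pair has the better (lower) fitness.
            opp = np.clip(lb + ub - cand, lb, ub)
            f_cand, f_opp = wcss(cand, data), wcss(opp, data)
            new, f_new = (cand, f_cand) if f_cand <= f_opp else (opp, f_opp)
            # Greedy replacement of the current individual.
            if f_new < fit[i]:
                pop[i], fit[i] = new, f_new
    best_idx = np.argmin(fit)
    return pop[best_idx], fit[best_idx]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three synthetic 2-D Gaussian blobs as a toy dataset.
    data = np.vstack([rng.normal(c, 0.3, size=(50, 2))
                      for c in ([0, 0], [3, 3], [0, 4])])
    centers, score = enhanced_clustering(data, k=3)
    print("best centers:\n", centers, "\nWCSS:", score)
```

Note that in this sketch OBL doubles the number of fitness evaluations per individual per iteration, which is the usual trade-off of opposition-based learning: broader coverage of the search space at roughly twice the evaluation cost.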