{"title":"A novel adaptive optimization scheme for advancing metaheuristics and global optimization","authors":"Majid Ilchi Ghazaan , Amirmohammad Salmani Oshnari , Amirhossein Salmani Oshnari","doi":"10.1016/j.swevo.2024.101779","DOIUrl":null,"url":null,"abstract":"<div><div>Metaheuristics have been the dominant approach for tackling complex optimization challenges across diverse disciplines. Numerous studies have sought to enhance the performance of existing metaheuristics by identifying their limitations and modifying their frameworks. Despite these efforts, many resulting strategies remain overly complex, often narrowly focused on a single algorithm and a specific problem domain. In this study, we introduce a novel adaptive optimization scheme (AOS) designed as an algorithm-independent mechanism for enhancing the performance of metaheuristics by addressing various optimization challenges. This scheme is developed through a comprehensive integration of three substructures, each aimed at mitigating common deficiencies in metaheuristics across three optimization pillars: high exploration capabilities, effective avoidance of local optima, and strong exploitation capabilities. Three prominent approaches—Lévy Flights, Chaotic Local Search, and Opposition-based Learning—are skillfully combined to overcome these shortcomings in various metaheuristic algorithms, establishing a straightforward unit. Through rigorous testing on 50 diverse mathematical benchmark functions, we assessed the performance of original metaheuristics and their AOS-upgraded versions. The results confirm that the proposed AOS consistently elevates algorithmic effectiveness across multiple optimization metrics. Notably, four AOS-upgraded algorithms—EO-AOS, HBA-AOS, DBO-AOS, and PSO-AOS—emerge as the leading performers among the 16 algorithms under evaluation. Comparisons between the upgraded and baseline metaheuristics further reveal the substantial impact of AOS, as each upgraded variant demonstrably surpasses its original algorithm in various optimization capabilities.</div></div>","PeriodicalId":48682,"journal":{"name":"Swarm and Evolutionary Computation","volume":"91 ","pages":"Article 101779"},"PeriodicalIF":8.2000,"publicationDate":"2024-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Swarm and Evolutionary Computation","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2210650224003171","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Metaheuristics have been the dominant approach for tackling complex optimization challenges across diverse disciplines. Numerous studies have sought to enhance the performance of existing metaheuristics by identifying their limitations and modifying their frameworks. Despite these efforts, many of the resulting strategies remain overly complex and are often narrowly focused on a single algorithm and a specific problem domain. In this study, we introduce a novel adaptive optimization scheme (AOS) designed as an algorithm-independent mechanism for enhancing the performance of metaheuristics across a range of optimization challenges. The scheme integrates three substructures, each targeting a common deficiency of metaheuristics with respect to one of three optimization pillars: strong exploration, effective avoidance of local optima, and strong exploitation. Three prominent approaches, Lévy Flights, Chaotic Local Search, and Opposition-based Learning, are combined into a single, straightforward unit that overcomes these shortcomings in a variety of metaheuristic algorithms. Through rigorous testing on 50 diverse mathematical benchmark functions, we assessed the performance of the original metaheuristics and their AOS-upgraded versions. The results confirm that the proposed AOS consistently improves algorithmic effectiveness across multiple optimization metrics. Notably, four AOS-upgraded algorithms, EO-AOS, HBA-AOS, DBO-AOS, and PSO-AOS, emerge as the leading performers among the 16 algorithms under evaluation. Comparisons between the upgraded and baseline metaheuristics further reveal the substantial impact of AOS: each upgraded variant demonstrably surpasses its original algorithm in various optimization capabilities.
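To make the abstract's approach more concrete, the sketch below (Python with NumPy; all function names and parameter values are illustrative assumptions, not the paper's actual AOS or its settings) shows one minimal way that Lévy-flight steps, a logistic-map chaotic local search, and opposition-based learning could be combined into a single, algorithm-independent refinement step that a host metaheuristic calls on each candidate solution.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    # Mantegna's algorithm for Levy-distributed steps: rare long jumps aid exploration.
    rng = rng or np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def chaotic_local_search(x, lb, ub, shrink=0.01, rng=None):
    # One logistic-map iteration per dimension: small chaotic moves refine a good solution.
    rng = rng or np.random.default_rng()
    c = rng.uniform(0.1, 0.9, x.size)   # chaotic seed kept away from the map's fixed points
    c = 4.0 * c * (1.0 - c)             # logistic map
    return np.clip(x + shrink * (ub - lb) * (2.0 * c - 1.0), lb, ub)

def opposite_point(x, lb, ub):
    # Opposition-based learning: the mirror image of x within the search box.
    return lb + ub - x

def refine(x, best, fitness, lb, ub, rng=None):
    # Hypothetical algorithm-independent refinement step: generate one candidate per
    # operator and keep whichever point (including the original) has the best fitness.
    rng = rng or np.random.default_rng()
    candidates = [
        np.clip(best + levy_step(x.size, rng=rng) * (x - best), lb, ub),  # exploration
        chaotic_local_search(best, lb, ub, rng=rng),                      # exploitation
        opposite_point(x, lb, ub),                                        # escape local optima
    ]
    return min([x] + candidates, key=fitness)

# Example: refine one candidate on the sphere function.
sphere = lambda v: float(np.sum(v ** 2))
rng = np.random.default_rng(0)
x = rng.uniform(-5.0, 5.0, 10)
x_new = refine(x, best=x.copy(), fitness=sphere, lb=-5.0, ub=5.0, rng=rng)
```

The greedy selection at the end is what keeps such a wrapper safe to attach to any metaheuristic: a candidate is only replaced when one of the three operators actually improves its fitness.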
Journal Introduction:
Swarm and Evolutionary Computation is a pioneering peer-reviewed journal focused on the latest research and advancements in nature-inspired intelligent computation using swarm and evolutionary algorithms. It covers theoretical, experimental, and practical aspects of these paradigms and their hybrids, promoting interdisciplinary research. The journal prioritizes the publication of high-quality, original articles that push the boundaries of evolutionary computation and swarm intelligence. Additionally, it welcomes survey papers on current topics and novel applications. Topics of interest include, but are not limited to: Genetic Algorithms and Genetic Programming, Evolution Strategies and Evolutionary Programming, Differential Evolution, Artificial Immune Systems, Particle Swarms, Ant Colony, Bacterial Foraging, Artificial Bees, Firefly Algorithm, Harmony Search, Artificial Life, Digital Organisms, Estimation of Distribution Algorithms, Stochastic Diffusion Search, Quantum Computing, Nano Computing, Membrane Computing, Human-centric Computing, Hybridization of Algorithms, Memetic Computing, Autonomic Computing, Self-organizing Systems, and Combinatorial, Discrete, Binary, Constrained, Multi-objective, Multi-modal, Dynamic, and Large-scale Optimization.