SaDENAS: A self-adaptive differential evolution algorithm for neural architecture search
Xiaolong Han, Yu Xue, Zehong Wang, Yong Zhang, Anton Muravev, Moncef Gabbouj
Swarm and Evolutionary Computation, Volume 91, Article 101736 (published 18 September 2024)
DOI: 10.1016/j.swevo.2024.101736
URL: https://www.sciencedirect.com/science/article/pii/S2210650224002748
Citations: 0
Abstract
Evolutionary neural architecture search (ENAS) and differentiable architecture search (DARTS) are both prominent approaches to neural architecture search (NAS), enabling the automated design of deep neural networks. To leverage the strengths of both methods, the continuous ENAS framework alternates between optimizing the supernet with gradient descent and optimizing the architectural encodings with evolutionary algorithms. However, continuous ENAS suffers from premature convergence accompanied by the small model trap, a common problem in NAS. To address this, this paper proposes a self-adaptive differential evolution algorithm for neural architecture search (SaDENAS), which reduces the interference that small models cause to other individuals during optimization, thereby avoiding premature convergence. Specifically, SaDENAS treats architectures within the search space as architectural encodings and uses vector differences between encodings as the basis for its evolutionary operators. To trade off exploration against exploitation, we integrate local and global search strategies with a mutation scaling factor that adaptively balances the two. Empirical results demonstrate that the proposed algorithm achieves better performance and superior convergence compared with other algorithms.
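To make the mechanism concrete, below is a minimal, illustrative sketch of the classic DE/rand/1/bin step operating on real-valued architecture encodings, paired with a linearly decaying scaling factor F that stands in for the adaptive exploration/exploitation balance described in the abstract. The function names (`de_step`, `adaptive_F`), the toy fitness, and the linear decay schedule are assumptions for illustration only; SaDENAS's actual encoding scheme, operators, and self-adaptation rule are specified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def de_step(population, fitness, F=0.5, CR=0.9):
    """One DE/rand/1/bin generation over a population of encodings.

    population: (pop_size, dim) array of architecture encodings.
    fitness:    callable mapping an encoding to a score (lower is better).
    F:          mutation scaling factor (self-adapted in SaDENAS).
    CR:         crossover rate.
    """
    pop_size, dim = population.shape
    new_pop = population.copy()
    for i in range(pop_size):
        # Pick three distinct individuals other than the target i.
        candidates = [j for j in range(pop_size) if j != i]
        r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
        # Mutation: base vector plus a scaled difference vector,
        # i.e., the "vector differences between encodings" the abstract refers to.
        mutant = population[r1] + F * (population[r2] - population[r3])
        # Binomial crossover between target and mutant.
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True  # ensure at least one gene crosses over
        trial = np.where(mask, mutant, population[i])
        # Greedy selection: keep the trial only if it is no worse.
        if fitness(trial) <= fitness(population[i]):
            new_pop[i] = trial
    return new_pop

def adaptive_F(gen, max_gen, F_max=0.9, F_min=0.3):
    """Illustrative schedule: a larger F early favors global exploration,
    a smaller F later favors local exploitation. This linear decay is an
    assumption, not the adaptation rule used in the paper."""
    return F_max - (F_max - F_min) * gen / max_gen
```

A toy usage loop, with a sphere function standing in for the (much more expensive) supernet-based fitness evaluation used in continuous ENAS:

```python
pop = rng.random((20, 8))
for gen in range(50):
    F = adaptive_F(gen, max_gen=50)
    pop = de_step(pop, fitness=lambda x: float(np.sum(x ** 2)), F=F)
```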
Journal Introduction
Swarm and Evolutionary Computation is a pioneering peer-reviewed journal focused on the latest research and advancements in nature-inspired intelligent computation using swarm and evolutionary algorithms. It covers theoretical, experimental, and practical aspects of these paradigms and their hybrids, promoting interdisciplinary research. The journal prioritizes the publication of high-quality, original articles that push the boundaries of evolutionary computation and swarm intelligence. Additionally, it welcomes survey papers on current topics and novel applications. Topics of interest include but are not limited to: Genetic Algorithms and Genetic Programming, Evolution Strategies and Evolutionary Programming, Differential Evolution, Artificial Immune Systems, Particle Swarms, Ant Colony, Bacterial Foraging, Artificial Bees, Firefly Algorithm, Harmony Search, Artificial Life, Digital Organisms, Estimation of Distribution Algorithms, Stochastic Diffusion Search, Quantum Computing, Nano Computing, Membrane Computing, Human-centric Computing, Hybridization of Algorithms, Memetic Computing, Autonomic Computing, Self-organizing systems, Combinatorial, Discrete, Binary, Constrained, Multi-objective, Multi-modal, Dynamic, and Large-scale Optimization.