Wenji Li, Yifeng Qiu, Zhaojun Wang, Biao Xu, Zhifeng Hao, Qingfu Zhang, Yun Li, Zhun Fan
{"title":"昂贵约束多目标问题的代理辅助神经学习和进化优化","authors":"Wenji Li , Yifeng Qiu , Zhaojun Wang , Biao Xu , Zhifeng Hao , Qingfu Zhang , Yun Li , Zhun Fan","doi":"10.1016/j.swevo.2025.102020","DOIUrl":null,"url":null,"abstract":"<div><div>Expensive constrained multi-objective optimization problems (ECMOPs) present significant challenges due to the high computational cost of evaluating objective and constraint functions, which severely limits the number of feasible function evaluations. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy to dynamically construct surrogate models for the most relevant constraints. During the optimization process, a neural network is designed to learn the mapping between arbitrary weight vectors and their corresponding constrained Pareto optimal solutions. This enables the generation of high-quality solutions while requiring fewer expensive function evaluations. Additionally, a constraint screening mechanism is introduced to dynamically exclude constraints that are irrelevant to the current search phase, thus simplifying the surrogate models and improving the efficiency of the constrained search process. To evaluate the effectiveness of LEMO, we compare its performance against seven state-of-the-art algorithms on three benchmark suites, LIRCMOP, DASCMOP, and MW, as well as a real-world optimization problem. The experimental results demonstrate that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality.</div></div>","PeriodicalId":48682,"journal":{"name":"Swarm and Evolutionary Computation","volume":"97 ","pages":"Article 102020"},"PeriodicalIF":8.5000,"publicationDate":"2025-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Surrogate-assisted neural learning and evolutionary optimization for expensive constrained multi-objective problems\",\"authors\":\"Wenji Li , Yifeng Qiu , Zhaojun Wang , Biao Xu , Zhifeng Hao , Qingfu Zhang , Yun Li , Zhun Fan\",\"doi\":\"10.1016/j.swevo.2025.102020\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Expensive constrained multi-objective optimization problems (ECMOPs) present significant challenges due to the high computational cost of evaluating objective and constraint functions, which severely limits the number of feasible function evaluations. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy to dynamically construct surrogate models for the most relevant constraints. During the optimization process, a neural network is designed to learn the mapping between arbitrary weight vectors and their corresponding constrained Pareto optimal solutions. This enables the generation of high-quality solutions while requiring fewer expensive function evaluations. Additionally, a constraint screening mechanism is introduced to dynamically exclude constraints that are irrelevant to the current search phase, thus simplifying the surrogate models and improving the efficiency of the constrained search process. 
To evaluate the effectiveness of LEMO, we compare its performance against seven state-of-the-art algorithms on three benchmark suites, LIRCMOP, DASCMOP, and MW, as well as a real-world optimization problem. The experimental results demonstrate that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality.</div></div>\",\"PeriodicalId\":48682,\"journal\":{\"name\":\"Swarm and Evolutionary Computation\",\"volume\":\"97 \",\"pages\":\"Article 102020\"},\"PeriodicalIF\":8.5000,\"publicationDate\":\"2025-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Swarm and Evolutionary Computation\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2210650225001786\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Swarm and Evolutionary Computation","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2210650225001786","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Surrogate-assisted neural learning and evolutionary optimization for expensive constrained multi-objective problems
Expensive constrained multi-objective optimization problems (ECMOPs) present significant challenges due to the high computational cost of evaluating objective and constraint functions, which severely limits the number of feasible function evaluations. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy to dynamically construct surrogate models for the most relevant constraints. During the optimization process, a neural network is designed to learn the mapping between arbitrary weight vectors and their corresponding constrained Pareto optimal solutions. This enables the generation of high-quality solutions while requiring fewer expensive function evaluations. Additionally, a constraint screening mechanism is introduced to dynamically exclude constraints that are irrelevant to the current search phase, thus simplifying the surrogate models and improving the efficiency of the constrained search process. To evaluate the effectiveness of LEMO, we compare its performance against seven state-of-the-art algorithms on three benchmark suites, LIRCMOP, DASCMOP, and MW, as well as a real-world optimization problem. The experimental results demonstrate that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality.
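The two mechanisms named in the abstract can be conveyed with a minimal Python sketch: a small network that learns a mapping from preference weight vectors to approximations of their corresponding constrained Pareto-optimal solutions, and a simple screening rule that keeps only constraints violated by a non-trivial fraction of archived solutions. Everything below, including the network architecture, the MSE training loss, and the `screen_constraints` heuristic with its threshold, is an illustrative assumption and not the LEMO implementation reported in the paper.

```python
# Illustrative sketch only: a small MLP mapping a preference weight vector to a
# candidate decision vector, plus a simple constraint-screening heuristic.
# Architecture, loss, and screening rule are assumptions, not the authors' method.
import numpy as np
import torch
import torch.nn as nn


class WeightToSolutionNet(nn.Module):
    """Maps an M-dimensional weight vector to a D-dimensional decision vector."""

    def __init__(self, n_objectives: int, n_variables: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_objectives, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_variables), nn.Sigmoid(),  # decision vars scaled to [0, 1]
        )

    def forward(self, w: torch.Tensor) -> torch.Tensor:
        return self.net(w)


def train_mapping(weights: np.ndarray, solutions: np.ndarray, epochs: int = 200):
    """Fit the mapping on archived (weight vector, best-known solution) pairs."""
    model = WeightToSolutionNet(weights.shape[1], solutions.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    w = torch.tensor(weights, dtype=torch.float32)
    x = torch.tensor(solutions, dtype=torch.float32)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(w), x)  # regress archived solutions from their weights
        loss.backward()
        opt.step()
    return model


def screen_constraints(violations: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """Keep only constraints violated by more than `threshold` of archive solutions.

    `violations` has shape (n_solutions, n_constraints); positive entries mean violation.
    Returns a boolean mask of constraints deemed relevant to the current search phase.
    """
    violated_fraction = (violations > 0).mean(axis=0)
    return violated_fraction > threshold
```

In a surrogate-assisted loop of this kind, one would typically refit `train_mapping` on the archive of expensively evaluated (weight, solution) pairs as it grows, and use the screened constraint mask to decide which constraint surrogates to build; both steps here are only meant to convey the idea, not to reproduce LEMO's procedure.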
Journal introduction:
Swarm and Evolutionary Computation is a pioneering peer-reviewed journal focused on the latest research and advancements in nature-inspired intelligent computation using swarm and evolutionary algorithms. It covers theoretical, experimental, and practical aspects of these paradigms and their hybrids, promoting interdisciplinary research. The journal prioritizes the publication of high-quality, original articles that push the boundaries of evolutionary computation and swarm intelligence. Additionally, it welcomes survey papers on current topics and novel applications. Topics of interest include but are not limited to: Genetic Algorithms and Genetic Programming, Evolution Strategies and Evolutionary Programming, Differential Evolution, Artificial Immune Systems, Particle Swarms, Ant Colony, Bacterial Foraging, Artificial Bees, Fireflies Algorithm, Harmony Search, Artificial Life, Digital Organisms, Estimation of Distribution Algorithms, Stochastic Diffusion Search, Quantum Computing, Nano Computing, Membrane Computing, Human-centric Computing, Hybridization of Algorithms, Memetic Computing, Autonomic Computing, Self-organizing systems, Combinatorial, Discrete, Binary, Constrained, Multi-objective, Multi-modal, Dynamic, and Large-scale Optimization.