{"title":"Computing Optimal Populations for Binary Problems using Logic Minimization.","authors":"Pier Luca Lanzi","doi":"10.1162/EVCO.a.399","DOIUrl":"https://doi.org/10.1162/EVCO.a.399","url":null,"abstract":"<p><p>The study of generalization in XCS has been mainly focused on single-step binary problems for which the optimal (accurate, maximally general) solution is known. In contrast, binary multi-step problems have primarily been studied in terms of performance. However, while there is an intuitive notion of generalization in multi-step environments, the optimal solutions for even the simplest multi-step problems are still unknown. This paper presents an approach to compute the optimal solutions for single-step and multi-step binary problems starting from their tabular solution. We first illustrate the approach using Boolean functions for which the optimal solutions are known. Then, we apply it to compute, for the first time, the optimal solutions for theWoods problems that have been used as a testbed to study XCS behavior in multi-step problems. The solutions confirm early intuitions on simple environments and shed new light on more complex problems. We compare the optimal solutions our approach computes with the condensed populations that XCS can evolve, showing that XCS consistently evolves a number of minimal solutions that increases with the number of learning problems. 
Finally, we extend our approach to compute the minimal representation of evolving classifier populations and compare the size of the evolved populations before and after condensation with their minimized counterparts.</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-23"},"PeriodicalIF":3.4,"publicationDate":"2026-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147845553","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhancing Generalization and Scalability for Multi-Objective Optimization with Population Pre-Training.","authors":"Haokai Hong, Liang Feng, Min Jiang, Kay Chen Tan","doi":"10.1162/EVCO.a.394","DOIUrl":"https://doi.org/10.1162/EVCO.a.394","url":null,"abstract":"<p><p>Multi-objective optimization problems (MOPs) require the simultaneous optimization of conflicting objectives. Real-world MOPs often exhibit complex characteristics, including high-dimensional decision spaces, many objectives, or computationally expensive evaluations. While population-based evolutionary computation has shown promise in addressing diverse MOPs through problem-specific adaptations, existing approaches frequently lack generalizability across distinct problem classes. Inspired by pre-training paradigms in machine learning, we propose a Population Pre-trained Model (PPM) that leverages historical optimization knowledge to solve complex MOPs within a unified framework efficiently. PPM models evolutionary patterns via population modeling, addressing two key challenges: (1) handling diverse decision spaces across problems and (2) capturing the interdependency between objective and decision spaces during evolution. To this end, we develop a population transformer architecture that embeds decision spaces of varying scales into a common latent space, enabling knowledge transfer across diverse problems. Furthermore, our architecture integrates objective-space features through objective fusion to enhance population prediction accuracy for complex MOPs. Our approach achieves robust generalization to downstream optimization tasks with up to 5,000 dimensions-five times the training scale and 200 times greater than prior work. 
Extensive evaluations on standardized benchmarks and out-of-training real-world applications demonstrate the consistent superiority of our method over state-of-the-art algorithms tailored to specific problem classes, improving the performance and generalization of evolutionary computation in solving MOPs.</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-32"},"PeriodicalIF":3.4,"publicationDate":"2026-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147610289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A dynamic multi-objective evolutionary algorithm using dual-space prediction and surrogate-based sampling.","authors":"Tianyu Liu, Xiangfei Wu, He Xu","doi":"10.1162/EVCO.a.393","DOIUrl":"https://doi.org/10.1162/EVCO.a.393","url":null,"abstract":"<p><p>The main challenge in handling dynamic multi-objective optimization problems lies in the need for algorithms to accurately track Pareto-optimal solutions in constantly changing environments. Most existing predictionbased dynamic multi-objective evolutionary algorithms (DMOEAs) conduct prediction either in the decision space or the objective space alone, or apply the same prediction model to both spaces. However, such approaches may fail to fully capture the distinct change patterns of each space, especially under nonlinear and complex environmental dynamics, thereby limiting the effectiveness of these algorithms. Furthermore, when sampling methods are used to help the algorithm generate populations in new environments, a large number of sampled individuals can impose a significant computational burden due to the increased number of function evaluations. To address these limitations, this paper proposes a dynamic multi-objective evolutionary algorithm, namely DS-DMOEA, which efficiently adapts to environmental changes through a dual-space prediction strategy and a surrogate-based sampling strategy. The dual-space prediction strategy captures dynamic changes by employing a weight vector-based method in the objective space and a geodesic flow kernel method in the decision space. Simultaneously, the surrogate-based sampling strategy generates a high-quality sampling population by training surrogate models with information from similar historical environments. The predicted and sampled populations are then combined to form an initial population well-suited for the new environment. 
DS-DMOEA has been tested against nine state-of-the-art DMOEAs on 19 benchmark problems with three types of environmental change patterns. The experimental results validate the effectiveness of the proposed algorithm.</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-40"},"PeriodicalIF":3.4,"publicationDate":"2026-03-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147357579","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"XCS for Sequential Perceptual Aliasing in Multi-Step Decision Making.","authors":"Fumito Uwano, Will N Browne","doi":"10.1162/EVCO.a.392","DOIUrl":"https://doi.org/10.1162/EVCO.a.392","url":null,"abstract":"<p><p>Sequential perceptual aliasing is a cognitive challenge for learning agents when robots cannot differentiate states and their associations based on immediate observations, leading to poor decision-making. Existing systems struggle to abstract and distinguish observations effectively to achieve policy learning. This paper addresses this issue by introducing new aliasing types within the context of sequential aliasing and proposing an enhanced XCS classifier system that learns using a complete state-action map. The proposed system called hierarchical Frames-of-References-based XCS (Hi-FoRsXCS), can concatenate sequences of aliased states with the same observation into a chain. Hi- FoRsXCS then predicts associations between the observations and aliased states using the ends of the chain, enabling optimal policy learning with a complete action map. Experimental results demonstrate that Hi-FoRsXCS outperforms the existing systems in terms of accuracy. However, the limitations of Hi-FoRsXCS will be discussed in this paper.</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-28"},"PeriodicalIF":3.4,"publicationDate":"2026-03-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147357574","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Solving Many-Objective Optimization Problems Based on PF Shape Classification and Vector Angle Selection","authors":"Y. T. Wu;F. Z. Ge;D. B. Chen;L. Shi","doi":"10.1162/evco_a_00373","DOIUrl":"10.1162/evco_a_00373","url":null,"abstract":"Most many-objective optimization algorithms (MaOEAs) adopt a pre-assumed Pareto front (PF) shape, instead of the true PF shape, to balance convergence and diversity in high-dimensional objective space, resulting in insufficient selection pressure and poor performance. To address these shortcomings, we propose MaOEA-PV based on PF shape classification and vector angle selection. The three innovation points of this paper are as follows: (i) a new method for PF classification; (ii) a new fitness function that combines convergence and diversity indicators, thereby enhancing the quality of parents during mating selection; and (iii) the selection of individuals exhibiting the best convergence to add to the population, overcoming the lack of selection pressure during environmental selection. Subsequently, the max-min vector angle strategy is employed. The solutions with the highest diversity and the least convergence are selected based on the max and min vector angles, respectively, which balances convergence and diversity. The performance of algorithm is compared with those of five state-of-the-art MaOEAs on 41 test problems and 5 real-world problems comprising as many 15 objectives. 
The experimental results demonstrate the competitive and effective nature of the proposed algorithm.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"34 1","pages":"53-101"},"PeriodicalIF":3.4,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143651776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BlindSMOTE: Synthetic Minority Oversampling Based Only on Evolutionary Computation","authors":"Nicolás E. García-Pedrajas;José M. Cuevas-Muñoz;Aida de Haro-García","doi":"10.1162/evco_a_00374","DOIUrl":"10.1162/evco_a_00374","url":null,"abstract":"One of the most common problems in data mining applications is the uneven distribution of classes, which appears in many real-world scenarios. The class of interest is often highly underrepresented in the given dataset, which harms the performance of most classifiers. One of the most successful methods for addressing the class imbalance problem is to oversample the minority class using synthetic samples. Since the original algorithm, the synthetic minority oversampling technique (SMOTE), introduced this method, numerous versions have emerged, each of which is based on a specific hypothesis about where and how to generate new synthetic instances. In this paper, we propose a different approach based exclusively on evolutionary computation that imposes no constraints on the creation of new synthetic instances. Majority class undersampling is also incorporated into the evolutionary process. A thorough comparison involving three classification methods, 85 datasets, and more than 90 class-imbalance strategies shows the advantages of our proposal.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"34 1","pages":"103-135"},"PeriodicalIF":3.4,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144023903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MO-SMAC: Multiobjective Sequential Model-Based Algorithm Configuration","authors":"Jeroen G. Rook;Carolin Benjamins;Jakob Bossek;Heike Trautmann;Holger H. Hoos;Marius Lindauer","doi":"10.1162/evco_a_00371","DOIUrl":"10.1162/evco_a_00371","url":null,"abstract":"Automated algorithm configuration aims at finding well-performing parameter configurations for a given problem, and it has proven to be effective within many AI domains, including evolutionary computation. Initially, the focus was on excelling in one performance objective, but, in reality, most tasks have a variety of (conflicting) objectives. The surging demand for trustworthy and resource-efficient AI systems makes this multiobjective perspective even more prevalent. We propose a new general-purpose multiobjective automated algorithm configurator by extending the widely-used SMAC framework. Instead of finding a single configuration, we search for a nondominated set that approximates the actual Pareto set. We propose a pure multiobjective Bayesian optimization approach for obtaining promising configurations by using the predicted hypervolume improvement as acquisition function. We also present a novel intensification procedure to efficiently handle the selection of configurations in a multiobjective context. 
Our approach is empirically validated and compared across various configuration scenarios in four AI domains, demonstrating superiority over baseline methods, competitiveness with MO-ParamILS on individual scenarios, and an overall best performance.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"34 1","pages":"29-52"},"PeriodicalIF":3.4,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143651775","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Cost of Randomness in Evolutionary Algorithms: Crossover Can Save Random Bits","authors":"Carlo Kneissl;Dirk Sudholt","doi":"10.1162/evco_a_00365","DOIUrl":"10.1162/evco_a_00365","url":null,"abstract":"Evolutionary algorithms make countless random decisions during selection, mutation, and crossover operations. These random decisions require a steady stream of random numbers. We analyze the expected number of random bits used throughout a run of an evolutionary algorithm and refer to this as the cost of randomness. We give general bounds on the cost of randomness for mutation-based evolutionary algorithms using 1-bit flips or standard mutations using either a naive or a common, more efficient implementation that uses Θ(logn) random bits per mutation. Uniform crossover is a potentially wasteful operator as the number of random bits used equals the Hamming distance of the two parents, which can be up to n. However, we show for a (2+1) genetic algorithm that is known to optimize the test function OneMax in roughly (e/2)nlnn expected evaluations, twice as fast as the fastest mutation-based evolutionary algorithms, that the total cost of randomness during all crossover operations on OneMax is only Θ(n). A more pronounced effect is shown for the common test function Jumpk, where there is an asymptotic decrease both in the number of evaluations and in the cost of randomness. 
Consequently, the use of crossover can reduce the cost of randomness below that of the fastest evolutionary algorithms that only use standard mutations.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"34 1","pages":"1-28"},"PeriodicalIF":3.4,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143015553","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Genetic Programming with Tabu List for Dynamic Flexible Job Shop Scheduling","authors":"Fangfang Zhang;Mazhar Ansari Ardeh;Yi Mei;Mengjie Zhang","doi":"10.1162/EVCO.a.26","DOIUrl":"10.1162/EVCO.a.26","url":null,"abstract":"Dynamic flexible job shop scheduling (DFJSS) is an important combinatorial optimisation problem, requiring simultaneous decision-making for machine assignment and operation sequencing in dynamic environments. Genetic programming (GP), as a hyper-heuristic approach, has been extensively employed for acquiring scheduling heuristics for DFJSS. A drawback of GP for DFJSS is that GP has weak exploration ability indicated by its quick diversity loss during the evolutionary process. This paper proposes an effective GP algorithm with tabu lists to capture the information of explored areas and guide GP to explore more unexplored areas to improve GP’s exploration ability for enhancing GP’s effectiveness. First, we use phenotypic characterisation to represent the behaviour of tree-based GP individuals for DFJSS as vectors. Then, we build tabu lists that contain phenotypic characterisations of explored individuals at the current generation and across generations, respectively. Finally, newly generated offspring are compared with the individuals’ phenotypic characterisations in the built tabu lists. If an individual is unseen in the tabu lists, it will be kept to form the new population at the next generation. Otherwise, it will be discarded. We have examined the proposed GP algorithm in nine different scenarios. The findings indicate that the proposed algorithm outperforms the compared algorithms in the majority of scenarios. The proposed algorithm can maintain a diverse and well-distributed population during the evolutionary process of GP. 
Further analyses show that the proposed algorithm does cover a large search area to find effective scheduling heuristics by focusing on unseen individuals.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"34 1","pages":"137-165"},"PeriodicalIF":3.4,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144081815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Editorial of the Special Issue: Parallel Problem Solving from Nature PPSN 2024 Extended Versions of Best Paper Candidates.","authors":"M Affenzeller, S M Winkler, A V Kononova, H Trautmann, T Tušar, P Machado, T Bäck","doi":"10.1162/EVCO.a.383","DOIUrl":"https://doi.org/10.1162/EVCO.a.383","url":null,"abstract":"","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-2"},"PeriodicalIF":3.4,"publicationDate":"2026-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146183270","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}