{"title":"A Tri-Objective Method for Bi-Objective Feature Selection in Classification","authors":"Ruwang Jiao;Bing Xue;Mengjie Zhang","doi":"10.1162/evco_a_00339","DOIUrl":"10.1162/evco_a_00339","url":null,"abstract":"Minimizing the number of selected features and maximizing the classification performance are two main objectives in feature selection, which can be formulated as a bi-objective optimization problem. Due to the complex interactions between features, a solution (i.e., feature subset) with poor objective values does not mean that all the features it selects are useless, as some of them combined with other complementary features can greatly improve the classification performance. Thus, it is necessary to consider not only the performance of feature subsets in the objective space, but also their differences in the search space, to explore more promising feature combinations. To this end, this paper proposes a tri-objective method for bi-objective feature selection in classification, which solves a bi-objective feature selection problem as a tri-objective problem by considering the diversity (differences) between feature subsets in the search space as the third objective. The selection based on the converted tri-objective method can maintain a balance between minimizing the number of selected features, maximizing the classification performance, and exploring more promising feature subsets. Furthermore, a novel initialization strategy and an offspring reproduction operator are proposed to promote the diversity of feature subsets in the objective space and improve the search ability, respectively. The proposed algorithm is compared with five multiobjective-based feature selection methods, six typical feature selection methods, and two peer methods with diversity as a helper objective. Experimental results on 20 real-world classification datasets suggest that the proposed method outperforms the compared methods in most scenarios.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 3","pages":"217-248"},"PeriodicalIF":4.6,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9822009","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IOHexperimenter: Benchmarking Platform for Iterative Optimization Heuristics","authors":"Jacob de Nobel;Furong Ye;Diederick Vermetten;Hao Wang;Carola Doerr;Thomas Bäck","doi":"10.1162/evco_a_00342","DOIUrl":"10.1162/evco_a_00342","url":null,"abstract":"We present IOHexperimenter, the experimentation module of the IOHprofiler project. IOHexperimenter aims at providing an easy-to-use and customizable toolbox for benchmarking iterative optimization heuristics such as local search, evolutionary and genetic algorithms, and Bayesian optimization techniques. IOHexperimenter can be used as a stand-alone tool or as part of a benchmarking pipeline that uses other modules of the IOHprofiler environment. IOHexperimenter provides an efficient interface between optimization problems and their solvers while allowing for granular logging of the optimization process. Its logs are fully compatible with existing tools for interactive data analysis, which significantly speeds up the deployment of a benchmarking pipeline. The main components of IOHexperimenter are the environment to build customized problem suites and the various logging options that allow users to steer the granularity of the data records.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 3","pages":"205-210"},"PeriodicalIF":4.6,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9862561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Pflacco: Feature-Based Landscape Analysis of Continuous and Constrained Optimization Problems in Python","authors":"Raphael Patrick Prager;Heike Trautmann","doi":"10.1162/evco_a_00341","DOIUrl":"10.1162/evco_a_00341","url":null,"abstract":"The herein proposed Python package pflacco provides a set of numerical features to characterize single-objective continuous and constrained optimization problems. Thereby, pflacco addresses two major challenges in the area of optimization. Firstly, it provides the means to develop an understanding of a given problem instance, which is crucial for designing, selecting, or configuring optimization algorithms in general. Secondly, these numerical features can be utilized in the research streams of automated algorithm selection and configuration. While the majority of these landscape features are already available in the R package flacco, our Python implementation offers these tools to an even wider audience and thereby promotes research interests and novel avenues in the area of optimization.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 3","pages":"211-216"},"PeriodicalIF":4.6,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9867698","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using Machine Learning Methods to Assess Module Performance Contribution in Modular Optimization Frameworks.","authors":"Ana Kostovska, Diederick Vermetten, Peter Korošec, Sašo Džeroski, Carola Doerr, Tome Eftimov","doi":"10.1162/evco_a_00356","DOIUrl":"https://doi.org/10.1162/evco_a_00356","url":null,"abstract":"<p><p>Modular algorithm frameworks not only allow for combinations never tested in manually selected algorithm portfolios, but they also provide a structured approach to assess which algorithmic ideas are crucial for the observed performance of algorithms. In this study, we propose a methodology for analyzing the impact of the different modules on the overall performance. We consider modular frameworks for two widely used families of derivative-free black-box optimization algorithms, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and differential evolution (DE). More specifically, we use performance data of 324 modCMA-ES and 576 modDE algorithm variants (with each variant corresponding to a specific configuration of modules) obtained on the 24 BBOB problems for 6 different runtime budgets in 2 dimensions. Our analysis of these data reveals that the impact of individual modules on overall algorithm performance varies significantly. Notably, among the examined modules, the elitism module in CMA-ES and the linear population size reduction module in DE exhibit the most significant impact on performance. Furthermore, our exploratory data analysis of problem landscape data suggests that the most relevant landscape features remain consistent regardless of the configuration of individual modules, but the influence that these features have on regression accuracy varies. In addition, we apply classifiers that exploit feature importance with respect to the trained models for performance prediction and performance data, to predict the modular configurations of CMA-ES and DE algorithm variants. The results show that the predicted configurations do not exhibit a statistically significant difference in performance compared to the true configurations, with the percentage varying depending on the setup (from 49.1% to 95.5% for mod-CMA and 21.7% to 77.1% for DE).</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"1-27"},"PeriodicalIF":4.6,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141890747","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural Architecture Search Using Covariance Matrix Adaptation Evolution Strategy","authors":"Nilotpal Sinha;Kuan-Wen Chen","doi":"10.1162/evco_a_00331","DOIUrl":"10.1162/evco_a_00331","url":null,"abstract":"Evolution-based neural architecture search methods have shown promising results, but they require high computational resources because these methods involve training each candidate architecture from scratch and then evaluating its fitness, which results in long search time. Covariance Matrix Adaptation Evolution Strategy (CMA-ES) has shown promising results in tuning hyperparameters of neural networks but has not been used for neural architecture search. In this work, we propose a framework called CMANAS which applies the faster convergence property of CMA-ES to the deep neural architecture search problem. Instead of training each individual architecture seperately, we used the accuracy of a trained one shot model (OSM) on the validation data as a prediction of the fitness of the architecture, resulting in reduced search time. We also used an architecture-fitness table (AF table) for keeping a record of the already evaluated architecture, thus further reducing the search time. The architectures are modeled using a normal distribution, which is updated using CMA-ES based on the fitness of the sampled population. Experimentally, CMANAS achieves better results than previous evolution-based methods while reducing the search time significantly. The effectiveness of CMANAS is shown on two different search spaces using four datasets: CIFAR-10, CIFAR-100, ImageNet, and ImageNet16-120. All the results show that CMANAS is a viable alternative to previous evolution-based methods and extends the application of CMA-ES to the deep neural architecture search field.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 2","pages":"177-204"},"PeriodicalIF":4.6,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9424655","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On Single-Objective Sub-Graph-Based Mutation for Solving the Bi-Objective Minimum Spanning Tree Problem","authors":"Jakob Bossek;Christian Grimme","doi":"10.1162/evco_a_00335","DOIUrl":"10.1162/evco_a_00335","url":null,"abstract":"We contribute to the efficient approximation of the Pareto-set for the classical NP-hard multiobjective minimum spanning tree problem (moMST) adopting evolutionary computation. More precisely, by building upon preliminary work, we analyze the neighborhood structure of Pareto-optimal spanning trees and design several highly biased sub-graph-based mutation operators founded on the gained insights. In a nutshell, these operators replace (un)connected sub-trees of candidate solutions with locally optimal sub-trees. The latter (biased) step is realized by applying Kruskal's single-objective MST algorithm to a weighted sum scalarization of a sub-graph. We prove runtime complexity results for the introduced operators and investigate the desirable Pareto-beneficial property. This property states that mutants cannot be dominated by their parent. Moreover, we perform an extensive experimental benchmark study to showcase the operator's practical suitability. Our results confirm that the sub-graph-based operators beat baseline algorithms from the literature even with severely restricted computational budget in terms of function evaluations on four different classes of complete graphs with different shapes of the Pareto-front.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 2","pages":"143-175"},"PeriodicalIF":4.6,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9967379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Role of Morphological Variation in Evolutionary Robotics: Maximizing Performance and Robustness","authors":"Jonata Tyska Carvalho;Stefano Nolfi","doi":"10.1162/evco_a_00336","DOIUrl":"10.1162/evco_a_00336","url":null,"abstract":"Exposing an evolutionary algorithm that is used to evolve robot controllers to variable conditions is necessary to obtain solutions which are robust and can cross the reality gap. However, we do not yet have methods for analyzing and understanding the impact of the varying morphological conditions which impact the evolutionary process, and therefore for choosing suitable variation ranges. By morphological conditions, we refer to the starting state of the robot, and to variations in its sensor readings during operation due to noise. In this paper, we introduce a method that permits us to measure the impact of these morphological variations and we analyze the relation between the amplitude of variations, the modality with which they are introduced, and the performance and robustness of evolving agents. Our results demonstrate that (i) the evolutionary algorithm can tolerate morphological variations which have a very high impact, (ii) variations affecting the actions of the agent are tolerated much better than variations affecting the initial state of the agent or of the environment, and (iii) improving the accuracy of the fitness measure through multiple evaluations is not always useful. Moreover, our results show that morphological variations permit generating solutions which perform better both in varying and non-varying conditions.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 2","pages":"125-142"},"PeriodicalIF":4.6,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9726876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparing Robot Controller Optimization Methods on Evolvable Morphologies","authors":"Fuda van Diggelen;Eliseo Ferrante;A. E. Eiben","doi":"10.1162/evco_a_00334","DOIUrl":"10.1162/evco_a_00334","url":null,"abstract":"In this paper, we compare Bayesian Optimization, Differential Evolution, and an Evolution Strategy employed as a gait-learning algorithm in modular robots. The motivational scenario is the joint evolution of morphologies and controllers, where “newborn” robots also undergo a learning process to optimize their inherited controllers (without changing their bodies). This context raises the question: How do gait-learning algorithms compare when applied to various morphologies that are not known in advance (and thus need to be treated as without priors)? To answer this question, we use a test suite of twenty different robot morphologies to evaluate our gait-learners and compare their efficiency, efficacy, and sensitivity to morphological differences. The results indicate that Bayesian Optimization and Differential Evolution deliver the same solution quality (walking speed for the robot) with fewer evaluations than the Evolution Strategy. Furthermore, the Evolution Strategy is more sensitive for morphological differences (its efficacy varies more between different morphologies) and is more subject to luck (repeated runs on the same morphology show greater variance in the outcomes).","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 2","pages":"105-124"},"PeriodicalIF":4.6,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9541798","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Editorial for the Special Issue on Reproducibility","authors":"Manuel López-Ibáñez;Luís Paquete;Mike Preuss","doi":"10.1162/evco_e_00344","DOIUrl":"10.1162/evco_e_00344","url":null,"abstract":"","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 1","pages":"1-2"},"PeriodicalIF":4.6,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139998205","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Practical Methodology for Reproducible Experimentation: An Application to the Double-Row Facility Layout Problem","authors":"Raúl Martín-Santamaría;Sergio Cavero;Alberto Herrán;Abraham Duarte;J. Manuel Colmenar","doi":"10.1162/evco_a_00317","DOIUrl":"10.1162/evco_a_00317","url":null,"abstract":"Reproducibility of experiments is a complex task in stochastic methods such as evolutionary algorithms or metaheuristics in general. Many works from the literature give general guidelines to favor reproducibility. However, none of them provide both a practical set of steps or software tools to help in this process. In this article, we propose a practical methodology to favor reproducibility in optimization problems tackled with stochastic methods. This methodology is divided into three main steps, where the researcher is assisted by software tools which implement state-of-the-art techniques related to this process. The methodology has been applied to study the double-row facility layout problem (DRFLP) where we propose a new algorithm able to obtain better results than the state-of-the-art methods. To this aim, we have also replicated the previous methods in order to complete the study with a new set of larger instances. All the produced artifacts related to the methodology and the study of the target problem are available in Zenodo.","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":"32 1","pages":"69-104"},"PeriodicalIF":4.6,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40695126","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}