{"title":"Large-scale optimization: Are co-operative co-evolution and fitness inheritance additive?","authors":"A. Hameed, D. Corne, David Morgan, A. Waldock","doi":"10.1109/UKCI.2013.6651294","DOIUrl":null,"url":null,"abstract":"Large-scale optimization - here referring mainly to problems with many design parameters - remains a serious challenge for optimization algorithms. When the problem at hand does not succumb to analytical treatment (an overwhelmingly commonplace situation), the engineering and adaptation of stochastic black box optimization methods tends to be a favoured approach, particularly the use of Evolutionary Algorithms (EAs). In this context, many approaches are currently under investigation for accelerating performance on large-scale problems, and we focus on two of those in this paper. The first is co-operative co-evolution (CC), where the strategy is to successively optimize only subsets of the design parameters at a time, keeping the remainder fixed, with an organized approach to managing and reconciling these `subspace' optimizations. The second is fitness inheritance (FI), which is essentially a very simple surrogate model strategy, in which, with some probability, the fitness of a solution is simply guessed to be a simple function of the fitnesses of that solution's `parents'. Both CC and FI have been found successful on nontrivial and multiple test cases, and they use fundamentally distinct strategies. In this article we explore the extent to which employing both of these strategies at once provides additional benefit. 
Based on experiments with 50D-1000D variants of four test functions, we find `CCEA-FI' to be highly effective, especially when a random grouping scheme is used in the CC component.","PeriodicalId":106191,"journal":{"name":"2013 13th UK Workshop on Computational Intelligence (UKCI)","volume":"730 ","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 13th UK Workshop on Computational Intelligence (UKCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UKCI.2013.6651294","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
Large-scale optimization - here referring mainly to problems with many design parameters - remains a serious challenge for optimization algorithms. When the problem at hand does not succumb to analytical treatment (an overwhelmingly commonplace situation), the engineering and adaptation of stochastic black-box optimization methods tends to be a favoured approach, particularly the use of Evolutionary Algorithms (EAs). In this context, many approaches are currently under investigation for accelerating performance on large-scale problems, and we focus on two of those in this paper. The first is co-operative co-evolution (CC), where the strategy is to successively optimize only subsets of the design parameters at a time, keeping the remainder fixed, with an organized approach to managing and reconciling these 'subspace' optimizations. The second is fitness inheritance (FI), which is essentially a very simple surrogate model strategy, in which, with some probability, the fitness of a solution is simply guessed to be a simple function of the fitnesses of that solution's 'parents'. Both CC and FI have been found successful on nontrivial and multiple test cases, and they use fundamentally distinct strategies. In this article we explore the extent to which employing both of these strategies at once provides additional benefit. Based on experiments with 50D-1000D variants of four test functions, we find 'CCEA-FI' to be highly effective, especially when a random grouping scheme is used in the CC component.
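To make the CCEA-FI combination concrete, the following is a minimal sketch of cooperative co-evolution with random grouping plus fitness inheritance, written against a standard sphere test function. All names and parameters here (`ccea_fi`, `p_inherit`, population size, mutation scales) are illustrative assumptions, not the authors' implementation; the inheritance rule used, averaging the two parents' fitnesses, is one common FI choice and may differ from the paper's.

```python
import random

def sphere(x):
    """Sphere test function: f(x) = sum(x_i^2); global minimum 0 at the origin."""
    return sum(v * v for v in x)

def ccea_fi(f, dim=50, groups=5, generations=200, pop_size=20,
            p_inherit=0.3, seed=0):
    """Sketch of cooperative co-evolution with random grouping and
    fitness inheritance.

    Each cycle, the variable indices are randomly re-partitioned into
    `groups` subcomponents; a simple steady-state EA optimizes each
    subcomponent in turn while the rest of the context vector stays fixed.
    With probability `p_inherit`, a child's fitness is guessed as the mean
    of its parents' fitnesses instead of being evaluated.
    """
    rng = random.Random(seed)
    context = [rng.uniform(-5, 5) for _ in range(dim)]
    best = f(context)
    indices = list(range(dim))
    size = -(-dim // groups)  # ceil division so every index lands in a group
    for _ in range(generations):
        rng.shuffle(indices)  # random grouping: fresh partition each cycle
        for g in range(groups):
            sub = indices[g * size:(g + 1) * size]
            if not sub:
                continue

            def full_fitness(vals):
                # Evaluate a subcomponent by splicing it into the context.
                trial = list(context)
                for i, v in zip(sub, vals):
                    trial[i] = v
                return f(trial)

            pop = [[context[i] + rng.gauss(0, 1) for i in sub]
                   for _ in range(pop_size)]
            fits = [full_fitness(ind) for ind in pop]
            for _ in range(pop_size):
                a, b = rng.sample(range(pop_size), 2)
                child = [(pop[a][k] + pop[b][k]) / 2 + rng.gauss(0, 0.1)
                         for k in range(len(sub))]
                if rng.random() < p_inherit:
                    cf = (fits[a] + fits[b]) / 2   # inherited (guessed) fitness
                else:
                    cf = full_fitness(child)       # true evaluation
                worst = max(range(pop_size), key=lambda i: fits[i])
                if cf < fits[worst]:
                    pop[worst], fits[worst] = child, cf
            # Re-evaluate the apparent best before committing it to the
            # context, since its stored fitness may be an inherited guess.
            bi = min(range(pop_size), key=lambda i: fits[i])
            true_fit = full_fitness(pop[bi])
            if true_fit < best:
                for i, v in zip(sub, pop[bi]):
                    context[i] = v
                best = true_fit
    return best, context
```

The re-evaluation before committing a subcomponent guards against an optimistic inherited fitness polluting the shared context vector; saved evaluations come from the children whose guessed fitness is never promoted.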