{"title":"Comma Selection Outperforms Plus Selection on OneMax with Randomly Planted Optima","authors":"Joost Jorritsma, Johannes Lengler, Dirk Sudholt","doi":"10.1007/s00453-025-01330-y","DOIUrl":null,"url":null,"abstract":"<div><p>Evolutionary algorithms (EAs) are general-purpose optimisation algorithms that maintain a population (multiset) of candidate solutions and apply variation operators to create new solutions called offspring. A new population is typically formed using one of two strategies: a <span>\\((\\mu +\\lambda )\\)</span> EA (plus selection) keeps the best <span>\\(\\mu \\)</span> search points out of the union of <span>\\(\\mu \\)</span> parents in the old population and <span>\\(\\lambda \\)</span> offspring, whereas a <span>\\((\\mu ,\\lambda )\\)</span> EA (comma selection) discards all parents and only keeps the best <span>\\(\\mu \\)</span> out of <span>\\(\\lambda \\)</span> offspring. Comma selection may help to escape from local optima, however when and how it is beneficial is subject to an ongoing debate. We propose a new benchmark function to investigate the benefits of comma selection: the well known benchmark function <span>OneMax</span>with randomly planted local optima, generated by frozen noise. We show that comma selection (the <span>\\({(1,\\lambda )}\\)</span> EA) is faster than plus selection (the <span>\\({(1+\\lambda )}\\)</span> EA) on this benchmark, in a fixed-target scenario, and for offspring population sizes <span>\\(\\lambda \\)</span> for which both algorithms behave differently. For certain parameters, the <span>\\({(1,\\lambda )}\\)</span> EAfinds the target in <span>\\(\\Theta (n \\ln n)\\)</span> evaluations, with high probability (w.h.p.), while the <span>\\({(1+\\lambda )}\\)</span> EAw.h.p. requires <span>\\(\\omega (n^2)\\)</span> evaluations. We further show that the advantage of comma selection is not arbitrarily large: w.h.p. comma selection outperforms plus selection at most by a factor of <span>\\(O(n \\ln n)\\)</span> for most reasonable parameter choices. We develop novel methods for analysing frozen noise and give powerful and general fixed-target results with tail bounds that are of independent interest.</p></div>","PeriodicalId":50824,"journal":{"name":"Algorithmica","volume":"87 12","pages":"1804 - 1863"},"PeriodicalIF":0.7000,"publicationDate":"2025-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s00453-025-01330-y.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Algorithmica","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s00453-025-01330-y","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0
Abstract
Evolutionary algorithms (EAs) are general-purpose optimisation algorithms that maintain a population (multiset) of candidate solutions and apply variation operators to create new solutions called offspring. A new population is typically formed using one of two strategies: a \((\mu +\lambda )\) EA (plus selection) keeps the best \(\mu \) search points out of the union of the \(\mu \) parents in the old population and the \(\lambda \) offspring, whereas a \((\mu ,\lambda )\) EA (comma selection) discards all parents and keeps only the best \(\mu \) of the \(\lambda \) offspring. Comma selection may help to escape from local optima; however, when and how it is beneficial is the subject of an ongoing debate. We propose a new benchmark function to investigate the benefits of comma selection: the well-known benchmark function OneMax with randomly planted local optima, generated by frozen noise. We show that comma selection (the \((1,\lambda )\) EA) is faster than plus selection (the \((1+\lambda )\) EA) on this benchmark, in a fixed-target scenario, and for offspring population sizes \(\lambda \) for which the two algorithms behave differently. For certain parameters, the \((1,\lambda )\) EA finds the target in \(\Theta (n \ln n)\) evaluations, with high probability (w.h.p.), while the \((1+\lambda )\) EA w.h.p. requires \(\omega (n^2)\) evaluations. We further show that the advantage of comma selection is not arbitrarily large: w.h.p. comma selection outperforms plus selection by at most a factor of \(O(n \ln n)\) for most reasonable parameter choices. We develop novel methods for analysing frozen noise and give powerful and general fixed-target results with tail bounds that are of independent interest.
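For concreteness, here is a minimal Python sketch of the two selection schemes on this kind of benchmark. It assumes a Gaussian frozen-noise model with strength `sigma` and a fixed-target criterion on the noise-free OneMax value; the paper's exact noise construction, parameter regimes, and target are not reproduced here, so all concrete names and values are illustrative.

```python
import random

def make_frozen_noise(sigma, seed=0):
    """Frozen noise: each search point receives a random offset that is
    drawn once, on first evaluation, and reused ever after.
    Hypothetical Gaussian model; the paper's construction may differ."""
    cache = {}
    rng = random.Random(seed)

    def noise(x):
        key = tuple(x)
        if key not in cache:
            cache[key] = rng.gauss(0.0, sigma)
        return cache[key]

    return noise

def onemax(x):
    """Number of one-bits; the noise-free part of the fitness."""
    return sum(x)

def mutate(x, rng):
    """Standard bit mutation: flip each bit independently with prob. 1/n."""
    n = len(x)
    return [b ^ (rng.random() < 1.0 / n) for b in x]

def run_ea(n, lam, comma, target, sigma, seed=0, max_evals=10**7):
    """Run a (1,lambda) EA (comma=True) or (1+lambda) EA (comma=False)
    until the noise-free OneMax value reaches `target`.
    Returns the number of evaluations used, or None on timeout."""
    rng = random.Random(seed)
    noise = make_frozen_noise(sigma, seed + 1)
    fitness = lambda x: onemax(x) + noise(x)  # OneMax with planted frozen noise
    parent = [rng.randrange(2) for _ in range(n)]
    evals = 0
    while evals < max_evals:
        offspring = [mutate(parent, rng) for _ in range(lam)]
        evals += lam
        best = max(offspring, key=fitness)
        if onemax(best) >= target:  # fixed-target criterion (assumed form)
            return evals
        if comma or fitness(best) >= fitness(parent):
            parent = best  # comma: parent always discarded; plus: only if no worse
    return None

if __name__ == "__main__":
    # Illustrative run comparing the two schemes on the same noise seed.
    n = 200
    for comma in (True, False):
        t = run_ea(n, lam=8, comma=comma, target=n, sigma=1.0, seed=42)
        print("comma" if comma else "plus ", t)
```

The only structural difference between the two algorithms is the final selection step: under comma selection the parent is unconditionally replaced by the best offspring, which lets the search leave a noise-induced local optimum, whereas plus selection retains the parent whenever every offspring is strictly worse.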
Journal description:
Algorithmica is an international journal which publishes theoretical papers on algorithms that address problems arising in practical areas, and experimental papers of general appeal because of their practical importance or techniques. The development of algorithms is an integral part of computer science. The increasing complexity and scope of computer applications make the design of efficient algorithms essential.
Algorithmica covers algorithms in applied areas such as: VLSI, distributed computing, parallel processing, automated design, robotics, graphics, database design, and software tools, as well as algorithms in fundamental areas such as sorting, searching, data structures, computational geometry, and linear programming.
In addition, the journal features two special sections: Application Experience, presenting findings obtained from applications of theoretical results to practical situations, and Problems, offering short papers presenting problems on selected topics of computer science.