{"title":"Differential evolution on the CEC-2013 single-objective continuous optimization testbed","authors":"A. K. Qin, Xiaodong Li","doi":"10.1109/CEC.2013.6557689","DOIUrl":null,"url":null,"abstract":"Differential evolution (DE) is one of the most powerful continuous optimizers in the field of evolutionary computation. This work systematically benchmarks a classic DE algorithm (DE/rand/1/bin) on the CEC-2013 single-objective continuous optimization testbed. We report, for each test function at different problem dimensionality, the best achieved performance among a wide range of potentially effective parameter settings. It reflects the intrinsic optimization capability of DE/rand/1/bin on this testbed and can serve as a baseline for performance comparison in future research using this testbed. Furthermore, we conduct parameter sensitivity analysis using advanced non-parametric statistical tests to discover statistically significantly superior parameter settings. This analysis provides a statistically reliable rule of thumb for choosing the parameters of DE/rand/1/bin to solve unseen problems. Moreover, we report the performance of DE/rand/1/bin using one superior parameter setting advocated by parameter sensitivity analysis.","PeriodicalId":211988,"journal":{"name":"2013 IEEE Congress on Evolutionary Computation","volume":"405 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"45","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE Congress on Evolutionary Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CEC.2013.6557689","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 45
Abstract
Differential evolution (DE) is one of the most powerful continuous optimizers in the field of evolutionary computation. This work systematically benchmarks a classic DE algorithm (DE/rand/1/bin) on the CEC-2013 single-objective continuous optimization testbed. For each test function at each tested problem dimensionality, we report the best performance achieved among a wide range of potentially effective parameter settings. These results reflect the intrinsic optimization capability of DE/rand/1/bin on this testbed and can serve as a baseline for performance comparison in future research using it. Furthermore, we conduct a parameter sensitivity analysis using advanced non-parametric statistical tests to identify parameter settings that are statistically significantly superior to the rest. This analysis provides a statistically reliable rule of thumb for choosing the parameters of DE/rand/1/bin when solving unseen problems. Finally, we report the performance of DE/rand/1/bin under one of the superior parameter settings identified by the sensitivity analysis.
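For readers unfamiliar with the algorithm named in the abstract, the sketch below outlines the classic DE/rand/1/bin loop (rand/1 mutation, binomial crossover, greedy one-to-one selection) as it is conventionally defined in the DE literature. This is not the authors' code: the control parameters F (scale factor), CR (crossover rate), and the population size are the standard DE parameters whose settings the paper studies, and the sphere function used in the usage example is only an illustrative stand-in, not one of the CEC-2013 benchmark functions.

```python
import numpy as np

def de_rand_1_bin(fobj, bounds, pop_size=50, F=0.5, CR=0.9, max_gens=1000, seed=None):
    """Minimal DE/rand/1/bin: rand/1 mutation + binomial crossover + greedy selection."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lower, upper = np.asarray(bounds, dtype=float).T

    # Initialize the population uniformly at random within the search bounds.
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fitness = np.array([fobj(x) for x in pop])

    for _ in range(max_gens):
        for i in range(pop_size):
            # rand/1 mutation: three mutually distinct individuals, none equal to i.
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], size=3, replace=False
            )
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            mutant = np.clip(mutant, lower, upper)  # simple bound handling

            # Binomial (uniform) crossover; force at least one gene from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])

            # Greedy selection: the trial replaces the target if it is no worse.
            f_trial = fobj(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

if __name__ == "__main__":
    # Illustrative run on the sphere function (not a CEC-2013 test function).
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = de_rand_1_bin(
        sphere, bounds=[(-100.0, 100.0)] * 10, F=0.5, CR=0.9, max_gens=200, seed=1
    )
    print(f_best)
```

The F and CR values shown (0.5 and 0.9) are common textbook defaults chosen purely for the example; the paper's contribution is precisely to determine, via non-parametric statistical tests, which such settings are significantly better on the CEC-2013 testbed.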