Large-Scale Expensive Optimization with a Switching Strategy
Mai Sun; Chaoli Sun; Xiaobo Li; Guochen Zhang; Farooq Akhtar
Complex System Modeling and Simulation, vol. 2, no. 3, pp. 253-263, 30 September 2022. DOI: 10.23919/CSMS.2022.0013
Open-access PDF: https://ieeexplore.ieee.org/iel7/9420428/9906545/09906551.pdf
Abstract
Some optimization problems in scientific research, such as robustness optimization for the Internet of Things and neural architecture search, have large-scale decision spaces and expensive objective evaluations. To obtain a good solution within a limited evaluation budget for such large-scale expensive optimization problems, a random grouping strategy is adopted to divide the problem into a number of low-dimensional sub-problems. A surrogate model is then trained for each sub-problem, with different strategies used to select the training data adaptively. After that, a dynamic infill criterion is proposed that corresponds to the models currently used in the surrogate-assisted sub-problem optimization. Furthermore, an escape mechanism is proposed to maintain the diversity of the population. The performance of the method is evaluated on the CEC'2013 benchmark functions. Experimental results show that the proposed algorithm achieves better performance in solving expensive large-scale optimization problems.
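As a rough illustration of the decomposition step described above, the sketch below shows one common way to implement random grouping: the decision-variable indices are shuffled and split into equal-size groups, each defining a low-dimensional sub-problem. The function name, parameters, and group sizes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def random_grouping(n_dims, n_groups, rng=None):
    # Hypothetical sketch of random grouping for cooperative
    # coevolution; the paper's exact procedure may differ.
    rng = np.random.default_rng(rng)
    perm = rng.permutation(n_dims)          # shuffle variable indices
    return np.array_split(perm, n_groups)   # one index set per sub-problem

# Example: a 1000-D problem split into 20 sub-problems of 50 variables each.
groups = random_grouping(n_dims=1000, n_groups=20, rng=42)
print(len(groups), [len(g) for g in groups[:3]])  # 20 [50, 50, 50]
```

Each index set would then get its own surrogate model and be optimized in turn, with the remaining variables held fixed; re-running the grouping periodically gives interacting variables a chance to land in the same sub-problem.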