Expected coordinate improvement for high-dimensional Bayesian optimization

Author: Dawei Zhan
Journal: Swarm and Evolutionary Computation, Volume 91, Article 101745
DOI: 10.1016/j.swevo.2024.101745
Publication date: 2024-10-10 (Journal Article)
Impact factor: 8.2; JCR: Q1 (Computer Science, Artificial Intelligence)
URL: https://www.sciencedirect.com/science/article/pii/S2210650224002839
Citations: 0

Abstract
The Bayesian optimization (BO) algorithm is popular for solving low-dimensional expensive optimization problems. Extending Bayesian optimization to high dimensions is a meaningful but challenging task. One major challenge is that it is difficult to find good infill solutions, since the acquisition functions are themselves high-dimensional. In this work, we propose the expected coordinate improvement (ECI) criterion for high-dimensional Bayesian optimization. The ECI criterion measures the potential improvement obtainable by moving the current best solution along a single coordinate. In each iteration, the proposed approach selects the coordinate with the highest ECI value to refine, and it gradually covers all the coordinates by iterating over them. The greatest advantage of the proposed ECI-BO (expected coordinate improvement based Bayesian optimization) algorithm over standard BO is that its infill selection problem is always one-dimensional and can therefore be solved easily. Numerical experiments show that the proposed algorithm achieves significantly better results than the standard BO algorithm, and competitive results compared with five high-dimensional BO algorithms and six surrogate-assisted evolutionary algorithms. This work provides a simple but efficient approach for high-dimensional Bayesian optimization. A Matlab implementation of our ECI-BO is available at https://github.com/zhandawei/Expected_Coordinate_Improvement.
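The coordinate-wise infill idea described in the abstract can be sketched roughly as follows. This is a minimal illustrative Python sketch, not the authors' MATLAB implementation (which is at the linked repository): the RBF Gaussian-process surrogate, the grid search over each coordinate, and all function names here are assumptions made for illustration.

```python
import numpy as np
from math import erf

# Illustrative sketch of a coordinate-wise infill step in the spirit of ECI
# (NOT the authors' implementation). A simple RBF Gaussian process serves as
# the surrogate, and each 1-D acquisition problem is solved by grid search.

def rbf_kernel(A, B, length=1.0):
    # Squared-exponential kernel between row vectors of A (n,d) and B (m,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

def gp_posterior(X, y, Xq, noise=1e-6):
    # Posterior mean and std of a zero-mean unit-variance RBF GP at Xq.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.sum(Ks * v.T, axis=1), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Standard EI for minimization: E[max(best - f, 0)].
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + sigma * phi

def eci_step(X, y, bounds, grid=101):
    """Evaluate a 1-D expected improvement along each coordinate of the
    current best solution and return the candidate from the coordinate
    with the highest value -- a one-dimensional infill problem per axis."""
    i_best = np.argmin(y)
    x_best, y_best = X[i_best].copy(), y[i_best]
    best_val, best_cand = -np.inf, None
    for j in range(X.shape[1]):  # one 1-D search per coordinate
        ts = np.linspace(bounds[j][0], bounds[j][1], grid)
        Xq = np.tile(x_best, (grid, 1))
        Xq[:, j] = ts                      # vary only coordinate j
        mu, sd = gp_posterior(X, y, Xq)
        ei = expected_improvement(mu, sd, y_best)
        k = np.argmax(ei)
        if ei[k] > best_val:
            best_val, best_cand = ei[k], Xq[k]
    return best_cand
```

In a full loop one would evaluate the expensive objective at the returned candidate, append it to the data, and repeat; the paper's actual coordinate-selection and iteration scheme should be taken from the linked repository.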
Journal introduction:
Swarm and Evolutionary Computation is a pioneering peer-reviewed journal focused on the latest research and advancements in nature-inspired intelligent computation using swarm and evolutionary algorithms. It covers theoretical, experimental, and practical aspects of these paradigms and their hybrids, promoting interdisciplinary research. The journal prioritizes the publication of high-quality, original articles that push the boundaries of evolutionary computation and swarm intelligence. Additionally, it welcomes survey papers on current topics and novel applications. Topics of interest include but are not limited to: Genetic Algorithms, and Genetic Programming, Evolution Strategies, and Evolutionary Programming, Differential Evolution, Artificial Immune Systems, Particle Swarms, Ant Colony, Bacterial Foraging, Artificial Bees, Fireflies Algorithm, Harmony Search, Artificial Life, Digital Organisms, Estimation of Distribution Algorithms, Stochastic Diffusion Search, Quantum Computing, Nano Computing, Membrane Computing, Human-centric Computing, Hybridization of Algorithms, Memetic Computing, Autonomic Computing, Self-organizing systems, Combinatorial, Discrete, Binary, Constrained, Multi-objective, Multi-modal, Dynamic, and Large-scale Optimization.