{"title":"A Framework for Nonlinearly-Constrained Gradient-Enhanced Local Bayesian Optimization With Comparisons to Quasi-Newton Optimizers","authors":"André L. Marchildon, David W. Zingg","doi":"10.1002/nme.70118","DOIUrl":null,"url":null,"abstract":"<p>Bayesian optimization is a popular and versatile approach that is well suited to solve challenging optimization problems. Their popularity comes from their effective minimization of expensive function evaluations, their capability to leverage gradients, and their efficient use of noisy data. Bayesian optimizers have commonly been applied to global unconstrained problems, with limited development for many other classes of problems. In this article, two alternative methods are developed that enable rapid and deep convergence of nonlinearly-constrained local optimization problems using a Bayesian optimizer. The first method uses an exact augmented Lagrangian and the second augments the minimization of the acquisition function to contain additional constraints. Both of these methods can be applied to nonlinear equality constraints, unlike most previous methods developed for constrained Bayesian optimizers. The new methods are applied with a gradient-enhanced Bayesian optimizer and enable deeper convergence for three nonlinearly-constrained unimodal optimization problems than previously developed methods for constrained Bayesian optimization. In addition, both new methods enable the Bayesian optimizer to reach a desired tolerance with fewer function evaluations than popular quasi-Newton optimizers from SciPy and MATLAB for unimodal problems with 2 to 30 variables. The Bayesian optimizer had similar results using both methods. It is recommended that users first try using the second method, which adds constraints to the acquisition function minimization, since its parameters are more intuitive to tune for new problems.</p>","PeriodicalId":13699,"journal":{"name":"International Journal for Numerical Methods in Engineering","volume":"126 18","pages":""},"PeriodicalIF":2.9000,"publicationDate":"2025-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/nme.70118","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal for Numerical Methods in Engineering","FirstCategoryId":"5","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/nme.70118","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Bayesian optimization is a popular and versatile approach that is well suited to solving challenging optimization problems. Its popularity comes from its effective minimization of the number of expensive function evaluations, its capability to leverage gradients, and its efficient use of noisy data. Bayesian optimizers have commonly been applied to global unconstrained problems, with limited development for many other classes of problems. In this article, two alternative methods are developed that enable rapid and deep convergence of nonlinearly-constrained local optimization problems using a Bayesian optimizer. The first method uses an exact augmented Lagrangian, and the second adds the constraints directly to the minimization of the acquisition function. Both of these methods can be applied to nonlinear equality constraints, unlike most previous methods developed for constrained Bayesian optimization. The new methods are applied with a gradient-enhanced Bayesian optimizer and enable deeper convergence on three nonlinearly-constrained unimodal optimization problems than previously developed methods for constrained Bayesian optimization. In addition, both new methods enable the Bayesian optimizer to reach a desired tolerance with fewer function evaluations than popular quasi-Newton optimizers from SciPy and MATLAB for unimodal problems with 2 to 30 variables. The Bayesian optimizer achieved similar results with both methods. It is recommended that users first try the second method, which adds constraints to the acquisition function minimization, since its parameters are more intuitive to tune for new problems.
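To make the two approaches concrete, the sketches below illustrate the general ideas in standard textbook form. They are assumptions based on common practice, not the authors' formulations, which the abstract does not specify; every symbol and function in them should be read as hypothetical. For the first method, one standard "exact" augmented Lagrangian for equality constraints c(x) = 0 is the Fletcher-style form with a least-squares multiplier estimate:

```latex
% A Fletcher-style exact augmented Lagrangian (illustrative; the paper's
% variant may differ):
\mathcal{L}_A(x; \mu)
  = f(x) + \lambda(x)^{\top} c(x) + \frac{\mu}{2} \,\lVert c(x) \rVert_2^2,
\qquad
\lambda(x) = \operatorname*{arg\,min}_{\lambda}
  \bigl\lVert \nabla f(x) + \nabla c(x)\, \lambda \bigr\rVert_2^2
```

In exact variants of this form, a sufficiently large but finite penalty weight μ suffices for minimizers of the augmented Lagrangian to coincide with constrained minimizers, which is consistent with the deep convergence the abstract reports. The second method instead passes the constraints to the inner minimization of the acquisition function. A minimal SciPy sketch, where the acquisition and the surrogate constraint are stand-in placeholders for a gradient-enhanced GP posterior:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def acquisition(x):
    # Stand-in lower-confidence-bound-style acquisition: posterior mean minus
    # an exploration bonus. Both terms are hypothetical placeholders for a
    # gradient-enhanced GP posterior.
    mean = np.sum((x - 0.5) ** 2)
    std = 0.1 * np.exp(-np.sum(x ** 2))
    return mean - 2.0 * std

def constraint_model(x):
    # Hypothetical surrogate of a nonlinear equality constraint c(x) = 0,
    # here the unit circle.
    return x[0] ** 2 + x[1] ** 2 - 1.0

# Equality constraint: lower bound == upper bound == 0.
nlc = NonlinearConstraint(constraint_model, 0.0, 0.0)

x0 = np.array([0.8, 0.6])
res = minimize(acquisition, x0, method="SLSQP", constraints=[nlc])
print(res.x)  # candidate point: the next location at which to evaluate f
```

Each outer iteration of such a scheme hands the surrogate constraints to a standard constrained subproblem solver, which may explain the abstract's observation that this route's parameters are more intuitive to tune.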
Journal Introduction:
The International Journal for Numerical Methods in Engineering publishes original papers describing significant, novel developments in numerical methods that are applicable to engineering problems.
The Journal is known for welcoming contributions in a wide range of areas in computational engineering, including computational issues in model reduction, uncertainty quantification, verification and validation, inverse analysis and stochastic methods, optimisation, element technology, solution techniques and parallel computing, damage and fracture, mechanics at micro and nano-scales, low-speed fluid dynamics, fluid-structure interaction, electromagnetics, coupled diffusion phenomena, and error estimation and mesh generation. This list is by no means exhaustive; papers on multi-scale, multi-physics, or multi-disciplinary problems, and on new, emerging topics, are particularly welcome.