A Framework for Nonlinearly-Constrained Gradient-Enhanced Local Bayesian Optimization With Comparisons to Quasi-Newton Optimizers

IF 2.9 · CAS Tier 3 (Engineering & Technology) · Q1 ENGINEERING, MULTIDISCIPLINARY
André L. Marchildon, David W. Zingg
Journal: International Journal for Numerical Methods in Engineering, Vol. 126, No. 18
DOI: 10.1002/nme.70118
Published: 2025-09-12 (Journal Article)
PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/nme.70118
Article page: https://onlinelibrary.wiley.com/doi/10.1002/nme.70118
Citations: 0

Abstract

Bayesian optimization is a popular and versatile approach that is well suited to challenging optimization problems. Its popularity comes from its ability to minimize the number of expensive function evaluations, to leverage gradients, and to make efficient use of noisy data. Bayesian optimizers have commonly been applied to global unconstrained problems, with limited development for many other classes of problems. In this article, two alternative methods are developed that enable rapid and deep convergence of nonlinearly-constrained local optimization problems using a Bayesian optimizer. The first method uses an exact augmented Lagrangian, and the second augments the minimization of the acquisition function to include additional constraints. Both of these methods can be applied to nonlinear equality constraints, unlike most previous methods developed for constrained Bayesian optimizers. The new methods are applied with a gradient-enhanced Bayesian optimizer and enable deeper convergence on three nonlinearly-constrained unimodal optimization problems than previously developed methods for constrained Bayesian optimization. In addition, both new methods enable the Bayesian optimizer to reach a desired tolerance with fewer function evaluations than popular quasi-Newton optimizers from SciPy and MATLAB on unimodal problems with 2 to 30 variables. The Bayesian optimizer had similar results using both methods. It is recommended that users first try the second method, which adds constraints to the acquisition function minimization, since its parameters are more intuitive to tune for new problems.
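The second method described above, minimizing the acquisition function subject to additional constraints, can be illustrated with a minimal SciPy sketch. This is not the paper's implementation: the quadratic `acquisition` and `constraint_model` functions below are hypothetical stand-ins for the Gaussian-process surrogate models a gradient-enhanced Bayesian optimizer would actually build, and the solver choice (SLSQP) is only one reasonable option for handling a nonlinear equality constraint.

```python
# Illustrative sketch (assumed names, not the paper's code): pick the next
# evaluation point by minimizing an acquisition surrogate subject to a
# nonlinear equality constraint c(x) = 0 modeled by its own surrogate.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def acquisition(x):
    # Placeholder for the acquisition function (e.g., a GP posterior mean).
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def constraint_model(x):
    # Placeholder surrogate for the equality constraint c(x) = 0
    # (here: the unit circle).
    return x[0] ** 2 + x[1] ** 2 - 1.0

# lb == ub == 0 makes this an equality constraint for the solver.
con = NonlinearConstraint(constraint_model, 0.0, 0.0)
res = minimize(acquisition, x0=np.array([0.5, 0.5]), method="SLSQP",
               constraints=[con])
print(res.x)  # next evaluation point, feasible w.r.t. the surrogate constraint
```

Under this setup the minimizer lands on the unit circle in the direction of the unconstrained optimum, so every point the Bayesian optimizer proposes respects the modeled constraints, which is the intuition behind the recommended second method.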


Source journal
CiteScore: 5.70
Self-citation rate: 6.90%
Articles per year: 276
Review time: 5.3 months
Journal description: The International Journal for Numerical Methods in Engineering publishes original papers describing significant, novel developments in numerical methods that are applicable to engineering problems. The Journal is known for welcoming contributions in a wide range of areas in computational engineering, including computational issues in model reduction, uncertainty quantification, verification and validation, inverse analysis and stochastic methods, optimisation, element technology, solution techniques and parallel computing, damage and fracture, mechanics at micro and nano-scales, low-speed fluid dynamics, fluid-structure interaction, electromagnetics, coupled diffusion phenomena, and error estimation and mesh generation. It is emphasized that this is by no means an exhaustive list, and particularly papers on multi-scale, multi-physics or multi-disciplinary problems, and on new, emerging topics are welcome.