Joint entropy search for multi-objective Bayesian optimization with constraints and multiple fidelities

IF 6.5 · CAS Tier 2 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence)
Daniel Fernández-Sánchez, Daniel Hernández-Lobato
{"title":"Joint entropy search for multi-objective Bayesian optimization with constraints and multiple fidelities","authors":"Daniel Fernández-Sánchez,&nbsp;Daniel Hernández-Lobato","doi":"10.1016/j.neucom.2025.131674","DOIUrl":null,"url":null,"abstract":"<div><div>Bayesian optimization (BO) methods can be used to solve efficiently problems with several objectives and constraints. Each objective and constraint is considered a black-box function that is expensive to evaluate, lacking a closed-form expression. BO methods use a model of each black-box to guide the search for the problem’s solution. Specifically, they make intelligent decisions about where each black-box function should be evaluated next with the goal of finding the solution using a few evaluations only. Sometimes, however, the black-boxes may be evaluated at different fidelity levels. A lower fidelity is simply a cheap proxy for the corresponding black-box. These lower fidelities correlate with the actual black-boxes to optimize and can, therefore, be used to reduce the overall cost of solving the optimization problem. Here, we propose Multi-fidelity Joint Entropy Search for Multi-objective Bayesian Optimization with Constraints (MF-JESMOC), a BO method for solving the aforementioned problems. MF-JESMOC chooses the next point, and fidelity level at which to evaluate the black-boxes, as the combination that is expected to reduce the most the joint entropy of the Pareto set and the Pareto front, normalized by the fidelity’s evaluation cost. We use Deep Gaussian processes to model each black-box and the dependencies between fidelities. These are powerful probabilistic models that can learn the dependency structure among fidelity levels of each black-box. Several experiments show that MF-JESMOC outperforms other state-of-the-art methods for multi-objective BO with constraints and different fidelity levels in both synthetic and real-world problems.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"657 ","pages":"Article 131674"},"PeriodicalIF":6.5000,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S092523122502346X","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Bayesian optimization (BO) methods can be used to efficiently solve problems with several objectives and constraints. Each objective and constraint is treated as a black-box function that is expensive to evaluate and lacks a closed-form expression. BO methods use a probabilistic model of each black-box to guide the search for the problem’s solution. Specifically, they make informed decisions about where each black-box function should be evaluated next, with the goal of finding the solution using only a few evaluations. Sometimes, however, the black-boxes may be evaluated at different fidelity levels. A lower fidelity is simply a cheap proxy for the corresponding black-box. These lower fidelities correlate with the actual black-boxes to be optimized and can therefore be used to reduce the overall cost of solving the optimization problem. Here, we propose Multi-fidelity Joint Entropy Search for Multi-objective Bayesian Optimization with Constraints (MF-JESMOC), a BO method for solving such problems. MF-JESMOC chooses the next point, and the fidelity level at which to evaluate the black-boxes, as the combination that is expected to reduce the joint entropy of the Pareto set and the Pareto front the most, normalized by the fidelity’s evaluation cost. We use deep Gaussian processes to model each black-box and the dependencies between fidelities. These are powerful probabilistic models that can learn the dependency structure among the fidelity levels of each black-box. Several experiments show that MF-JESMOC outperforms other state-of-the-art methods for multi-objective BO with constraints and multiple fidelity levels on both synthetic and real-world problems.
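
The acquisition rule described in the abstract, i.e., choosing the candidate input and fidelity that maximize the expected reduction in the joint entropy of the Pareto set and the Pareto front per unit of evaluation cost, can be sketched as follows. This is only an illustrative formulation based on the abstract; the notation ($\mathcal{X}^\star$ for the Pareto set, $\mathcal{Y}^\star$ for the Pareto front, $\mathcal{D}$ for the data collected so far, $y^{(s)}(\mathbf{x})$ for the observations at candidate point $\mathbf{x}$ and fidelity $s$, and $\lambda_s$ for the cost of evaluating fidelity $s$) is introduced here for illustration and may differ from the paper’s.

$$
\alpha(\mathbf{x}, s) \;=\; \frac{H\!\left[\mathcal{X}^\star, \mathcal{Y}^\star \mid \mathcal{D}\right] \;-\; \mathbb{E}_{y^{(s)}(\mathbf{x})}\!\left[\, H\!\left[\mathcal{X}^\star, \mathcal{Y}^\star \mid \mathcal{D} \cup \{(\mathbf{x}, y^{(s)}(\mathbf{x}))\}\right] \right]}{\lambda_s}\,,
\qquad
(\mathbf{x}_{\text{next}}, s_{\text{next}}) \;=\; \operatorname*{arg\,max}_{\mathbf{x},\, s}\; \alpha(\mathbf{x}, s).
$$

In words: the numerator is the expected information gained, jointly about the Pareto set and the Pareto front, from evaluating the black-boxes at $(\mathbf{x}, s)$, and dividing by $\lambda_s$ trades that gain off against the cost of the chosen fidelity, so cheap low-fidelity evaluations are preferred whenever they are expected to be nearly as informative as full-fidelity ones.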
Source Journal

Neurocomputing (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.