Dependence in constrained Bayesian optimization
Shiqiang Zhang, Robert M. Lee, Behrang Shafei, David Walz, Ruth Misener
Optimization Letters (published 2023-09-20). DOI: 10.1007/s11590-023-02047-z (https://doi.org/10.1007/s11590-023-02047-z)
Abstract
Constrained Bayesian optimization optimizes a black-box objective function subject to black-box constraints. For simplicity, most existing works assume that multiple constraints are independent. To ask when and how dependence between constraints helps, we remove this assumption and implement probability of feasibility with dependence (Dep-PoF) by applying multi-output Gaussian processes (MOGPs) as surrogate models and using expectation propagation to approximate the probabilities. We compare Dep-PoF with its independent counterpart, PoF. We propose two new acquisition functions incorporating Dep-PoF and test them on synthetic and practical benchmarks. Our results are largely negative: incorporating dependence between the constraints does not help much. Empirically, incorporating dependence between constraints may be useful if (i) the solution lies on the boundary of the feasible region(s) or (ii) the feasible set is very small. When these conditions are satisfied, the predictive covariance matrix from the MOGP may be poorly approximated by a diagonal matrix, and the off-diagonal elements may become important. Dep-PoF may apply to settings where (i) the constraints and their dependence are completely unknown and (ii) experiments are so expensive that any slightly better Bayesian optimization procedure is preferred. But, in most cases, Dep-PoF is indistinguishable from PoF.
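As a rough illustration of the quantity being compared, the Python sketch below (not the paper's code) contrasts the independent PoF, a product of marginal probabilities, with a dependence-aware Dep-PoF, a joint probability under the full predictive covariance. It assumes constraints written as c_i(x) <= 0 and takes the surrogate's predictive mean mu and covariance cov at a candidate point as given; the joint probability is evaluated here with SciPy's multivariate normal CDF rather than the expectation-propagation approximation used in the paper.

import numpy as np
from scipy.stats import norm, multivariate_normal

def pof_independent(mu, cov):
    # Independent PoF: prod_i P(c_i(x) <= 0), i.e. the predictive covariance
    # is treated as if it were diagonal.
    std = np.sqrt(np.diag(cov))
    return float(np.prod(norm.cdf(0.0, loc=mu, scale=std)))

def pof_dependent(mu, cov):
    # Dep-PoF: joint probability P(c_1(x) <= 0, ..., c_K(x) <= 0) under the
    # full Gaussian predictive distribution over the constraint values.
    return float(multivariate_normal.cdf(np.zeros(len(mu)), mean=mu, cov=cov))

# Hypothetical predictive distribution at one candidate point: two constraints
# close to their boundary and strongly positively correlated.
mu = np.array([-0.1, -0.1])
cov = np.array([[1.0, 0.9],
                [0.9, 1.0]])

print("PoF (independent):", pof_independent(mu, cov))
print("Dep-PoF (joint):  ", pof_dependent(mu, cov))

In this hypothetical case the two values differ noticeably (positive correlation makes simultaneous feasibility more likely than the independent product suggests); with a near-diagonal covariance, or a mean far from the constraint boundary, they essentially coincide, consistent with the abstract's observation about when the off-diagonal elements matter.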
About the journal:
Optimization Letters is an international journal covering all aspects of optimization, including theory, algorithms, computational studies, and applications, and providing an outlet for rapid publication of short communications in the field. Originality, significance, quality and clarity are the essential criteria for choosing the material to be published.
The field of optimization has been expanding in all directions at an astonishing rate during the last few decades. New algorithmic and theoretical techniques have been developed, the diffusion into other disciplines has proceeded at a rapid pace, and our knowledge of all aspects of the field has grown even more profound. At the same time, one of the most striking trends in optimization is the constantly increasing interdisciplinary nature of the field.
Optimization Letters aims to communicate recent developments in optimization in a timely fashion through concise short articles (limited to a total of ten journal pages). Such concise articles are easily accessible to readers working in any aspect of optimization who wish to be informed of recent developments.