Linewalker: line search for black box derivative-free optimization and surrogate model construction
Dimitri J. Papageorgiou, Jan Kronqvist, Krishnan Kumaran
Optimization and Engineering (published 2024-02-21). DOI: 10.1007/s11081-023-09879-9
Abstract
This paper describes a simple but effective sampling method for optimizing and learning a discrete approximation (or surrogate) of a multi-dimensional function along a one-dimensional line segment of interest. The method does not rely on derivative information, and the function to be learned can be a computationally expensive “black box” function that must be queried via simulation or other means. It is assumed that the underlying function is noise-free and smooth, although the algorithm can still be effective when the underlying function is piecewise smooth. The method constructs a smooth surrogate on a set of equally spaced grid points by evaluating the true function at a sparse set of judiciously chosen grid points. At each iteration, the surrogate’s non-tabu local minima and maxima are identified as candidates for sampling. Tabu search constructs are also used to promote diversification. If no non-tabu extrema are identified, a simple exploration step is taken by sampling the midpoint of the largest unexplored interval. The algorithm continues until a user-defined function evaluation limit is reached. Numerous examples illustrate the algorithm’s efficacy and superiority relative to state-of-the-art methods, including Bayesian optimization and NOMAD, on primarily nonconvex test functions.
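The abstract outlines the full sampling loop: seed a grid, fit a smooth surrogate to the sampled points, sample the surrogate's non-tabu extrema, and fall back to bisecting the largest unexplored interval when no candidate extrema exist. The sketch below is a minimal, hypothetical Python rendering of that loop, not the authors' Linewalker implementation: the cubic spline stands in for the paper's smooth surrogate, and the seeding rule, tabu radius/tenure, and the greedy "lowest surrogate value" selection are assumptions made purely for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def linewalker_sketch(f, a, b, n_grid=101, eval_budget=20,
                      tabu_tenure=3, tabu_radius=2):
    """Toy sampling loop in the spirit of the abstract (hypothetical sketch).

    f           : expensive black-box scalar function on [a, b]
    n_grid      : number of equally spaced grid points for the surrogate
    eval_budget : maximum number of true-function evaluations
    tabu_tenure : iterations a neighbourhood of a new sample stays tabu
    tabu_radius : half-width (in grid indices) of that tabu neighbourhood
    """
    grid = np.linspace(a, b, n_grid)
    samples = {}                            # grid index -> true function value
    tabu_until = np.zeros(n_grid, dtype=int)

    # Seed with four spread-out points so a cubic spline is well defined
    # (an assumed initialization, not the paper's rule).
    for idx in (0, n_grid // 3, 2 * n_grid // 3, n_grid - 1):
        samples[idx] = f(grid[idx])

    it = 0
    while len(samples) < eval_budget:
        it += 1
        known = np.array(sorted(samples))
        spline = CubicSpline(grid[known], [samples[i] for i in known])
        surrogate = spline(grid)            # smooth surrogate on the full grid

        # Candidates: interior local minima/maxima of the surrogate that are
        # neither tabu nor already sampled.
        candidates = [
            i for i in range(1, n_grid - 1)
            if i not in samples and tabu_until[i] <= it
            and ((surrogate[i] < surrogate[i - 1] and surrogate[i] < surrogate[i + 1])
                 or (surrogate[i] > surrogate[i - 1] and surrogate[i] > surrogate[i + 1]))
        ]

        if candidates:
            # Greedy choice: take the candidate with the lowest surrogate value
            # (the paper also samples maxima to improve the surrogate globally).
            nxt = min(candidates, key=lambda i: surrogate[i])
        else:
            # Exploration step: midpoint of the largest unexplored interval.
            gaps = np.diff(known)
            k = int(np.argmax(gaps))
            if gaps[k] < 2:
                break                       # every grid point has been sampled
            nxt = int((known[k] + known[k + 1]) // 2)

        samples[nxt] = f(grid[nxt])
        # Mark a small neighbourhood tabu to discourage immediate re-sampling nearby.
        lo, hi = max(0, nxt - tabu_radius), min(n_grid, nxt + tabu_radius + 1)
        tabu_until[lo:hi] = it + tabu_tenure

    best = min(samples, key=samples.get)
    return grid[best], samples[best]


if __name__ == "__main__":
    # Example: minimize a wiggly 1-D test function with a 20-evaluation budget.
    x_best, f_best = linewalker_sketch(lambda x: np.sin(3 * x) + 0.1 * x**2,
                                       a=-3.0, b=3.0)
    print(f"best point = {x_best:.3f}, value = {f_best:.3f}")
```

Even this crude variant shows the exploit/explore split the abstract describes: extrema of the surrogate drive exploitation, while midpoint sampling of the largest remaining gap keeps the line segment being covered and the surrogate improving globally.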
Journal Description:
Optimization and Engineering is a multidisciplinary journal; its primary goal is to promote the application of optimization methods in the general area of engineering sciences. We expect submissions to OPTE not only to make a significant optimization contribution but also to impact a specific engineering application.
Topics of Interest:
-Optimization: All methods and algorithms of mathematical optimization, including blackbox and derivative-free optimization, continuous optimization, discrete optimization, global optimization, linear and conic optimization, multiobjective optimization, PDE-constrained optimization & control, and stochastic optimization. Numerical and implementation issues, optimization software, benchmarking, and case studies.
-Engineering Sciences: Aerospace engineering, biomedical engineering, chemical & process engineering, civil, environmental, & architectural engineering, electrical engineering, financial engineering, geosciences, healthcare engineering, industrial & systems engineering, mechanical engineering & MDO, and robotics.