On the numerical performance of finite-difference-based methods for derivative-free optimization

H. Shi, M. Xuan, F. Öztoprak, J. Nocedal
{"title":"On the numerical performance of finite-difference-based methods for derivative-free optimization","authors":"H. Shi, M. Xuan, Figen Öztoprak, J. Nocedal","doi":"10.1080/10556788.2022.2121832","DOIUrl":null,"url":null,"abstract":"The goal of this paper is to investigate an approach for derivative-free optimization that has not received sufficient attention in the literature and is yet one of the simplest to implement and parallelize. In its simplest form, it consists of employing derivative-based methods for unconstrained or constrained optimization and replacing the gradient of the objective (and constraints) by finite-difference approximations. This approach is applicable to problems with or without noise in the functions. The differencing interval is determined by a bound on the second (or third) derivative and by the noise level, which is assumed to be known or to be accessible through difference tables or sampling. The use of finite-difference gradient approximations has been largely dismissed in the derivative-free optimization literature as too expensive in terms of function evaluations or as impractical in the presence of noise. However, the test results presented in this paper suggest that it has much to be recommended. The experiments compare newuoa, dfo-ls and cobyla against finite-difference versions of l-bfgs, lmder and knitro on three classes of problems: general unconstrained problems, nonlinear least squares problems and nonlinear programs with inequality constraints.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods and Software","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10556788.2022.2121832","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

The goal of this paper is to investigate an approach for derivative-free optimization that has not received sufficient attention in the literature, yet is one of the simplest to implement and parallelize. In its simplest form, it consists of employing derivative-based methods for unconstrained or constrained optimization and replacing the gradient of the objective (and constraints) by finite-difference approximations. This approach is applicable to problems with or without noise in the functions. The differencing interval is determined by a bound on the second (or third) derivative and by the noise level, which is assumed to be known or to be accessible through difference tables or sampling. The use of finite-difference gradient approximations has been largely dismissed in the derivative-free optimization literature as too expensive in terms of function evaluations or as impractical in the presence of noise. However, the test results presented in this paper suggest that there is much to recommend it. The experiments compare NEWUOA, DFO-LS and COBYLA against finite-difference versions of L-BFGS, LMDER and KNITRO on three classes of problems: general unconstrained problems, nonlinear least-squares problems and nonlinear programs with inequality constraints.
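To make the interval selection described in the abstract concrete, the sketch below implements a noise-aware finite-difference gradient and plugs it into a standard quasi-Newton solver. This is a minimal sketch, not the authors' implementation: the intervals h = 2*sqrt(eps_f/L) for forward differences (balancing the truncation bound L*h/2 against the noise bound 2*eps_f/h) and h = (3*eps_f/M)^(1/3) for central differences are the standard choices for this error model; SciPy's L-BFGS-B stands in for the finite-difference L-BFGS variant tested in the paper; and the noise level eps_f, the derivative bounds L and M, and the toy noisy objective are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): noise-aware finite-difference
# gradients driving an off-the-shelf quasi-Newton method.
# Assumptions: eps_f bounds the noise in f; L (resp. M) bounds the second
# (resp. third) derivative along each coordinate direction.

import numpy as np
from scipy.optimize import minimize


def fd_gradient(f, x, eps_f, L=1.0, M=1.0, scheme="forward"):
    """Finite-difference gradient with the differencing interval chosen to
    balance truncation error against the noise level eps_f."""
    n = x.size
    g = np.zeros(n)
    if scheme == "forward":
        # Minimizes the error bound L*h/2 + 2*eps_f/h over h.
        h = 2.0 * np.sqrt(eps_f / L)
        fx = f(x)  # n + 1 function evaluations in total
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
    else:
        # Central differences: minimizes M*h**2/6 + eps_f/h; 2n evaluations.
        h = (3.0 * eps_f / M) ** (1.0 / 3.0)
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g


# Illustrative use: a quadratic with bounded uniform noise, minimized by
# L-BFGS-B using the finite-difference gradient in place of a true gradient.
rng = np.random.default_rng(0)
eps_f = 1e-6  # assumed known noise level
noisy = lambda x: 0.5 * np.dot(x, x) + eps_f * rng.uniform(-1.0, 1.0)

res = minimize(noisy, x0=np.ones(5), method="L-BFGS-B",
               jac=lambda x: fd_gradient(noisy, x, eps_f))
print(res.x)
```

The design point this illustrates is the one the abstract makes: once eps_f and a curvature bound are available, the differencing interval is a closed-form choice rather than a tuning parameter, and any derivative-based solver accepting a user-supplied gradient can be run derivative-free. The per-coordinate evaluations are also independent, which is what makes the approach easy to parallelize.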