The limitation of neural nets for approximation and optimization

Impact Factor: 1.8 · CAS Tier 3 (Mathematics) · JCR Q1 (Mathematics)
T. Giovannelli, O. Sohab, L. N. Vicente
{"title":"The limitation of neural nets for approximation and optimization","authors":"T. Giovannelli, O. Sohab, L. N. Vicente","doi":"10.1007/s10898-024-01426-9","DOIUrl":null,"url":null,"abstract":"<p>We are interested in assessing the use of neural networks as surrogate models to approximate and minimize objective functions in optimization problems. While neural networks are widely used for machine learning tasks such as classification and regression, their application in solving optimization problems has been limited. Our study begins by determining the best activation function for approximating the objective functions of popular nonlinear optimization test problems, and the evidence provided shows that ReLU and SiLU exhibit the best performance on both training and testing data. We then analyze the accuracy of function value, gradient, and Hessian approximations for such objective functions obtained through interpolation/regression models and neural networks. When compared to interpolation/regression models, neural networks can deliver competitive zero- and first-order approximations (at a high training cost) but underperform on second-order approximation. However, it is shown that combining a neural net activation function with the natural basis for quadratic interpolation/regression can waive the necessity of including cross terms in the natural basis, leading to models with fewer parameters to determine. Lastly, we provide evidence that the performance of a state-of-the-art derivative-free optimization algorithm can hardly be improved when the gradient of an objective function is approximated using any of the surrogate models considered, including neural networks.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Global Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10898-024-01426-9","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Mathematics","Score":null,"Total":0}
Citations: 0

Abstract

We are interested in assessing the use of neural networks as surrogate models to approximate and minimize objective functions in optimization problems. While neural networks are widely used for machine learning tasks such as classification and regression, their application to solving optimization problems has been limited. Our study begins by determining the best activation function for approximating the objective functions of popular nonlinear optimization test problems, and the evidence shows that ReLU and SiLU exhibit the best performance on both training and testing data. We then analyze the accuracy of function value, gradient, and Hessian approximations for such objective functions obtained through interpolation/regression models and neural networks. Compared to interpolation/regression models, neural networks can deliver competitive zero- and first-order approximations (at a high training cost) but underperform on second-order approximations. However, it is shown that combining a neural net activation function with the natural basis for quadratic interpolation/regression can remove the need to include cross terms in the natural basis, leading to models with fewer parameters to determine. Lastly, we provide evidence that the performance of a state-of-the-art derivative-free optimization algorithm can hardly be improved when the gradient of an objective function is approximated using any of the surrogate models considered, including neural networks.
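As a concrete illustration of the kind of experiment described above, the sketch below fits a small multilayer perceptron with the SiLU activation (one of the two activations the abstract identifies as best) to the 2-D Rosenbrock function, a classic nonlinear optimization test problem, and then checks the surrogate's first-order (gradient) accuracy via automatic differentiation. This is a minimal sketch, not the authors' code: the test function, network size, sample count, and training settings are all illustrative assumptions.

```python
# Illustrative sketch only: a SiLU MLP surrogate for the 2-D Rosenbrock
# function, with a first-order (gradient) accuracy check via autodiff.
# All hyperparameters below are assumptions, not the paper's setup.
import torch
import torch.nn as nn

def rosenbrock(x: torch.Tensor) -> torch.Tensor:
    """Classic nonlinear test objective; x has shape (..., 2)."""
    return (1.0 - x[..., 0]) ** 2 + 100.0 * (x[..., 1] - x[..., 0] ** 2) ** 2

torch.manual_seed(0)
X = torch.rand(2000, 2) * 4.0 - 2.0        # training samples in [-2, 2]^2
y = rosenbrock(X).unsqueeze(1)             # raw (unnormalized) targets

# Small MLP surrogate; SiLU is one of the activations the paper found best.
model = nn.Sequential(
    nn.Linear(2, 64), nn.SiLU(),
    nn.Linear(64, 64), nn.SiLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):                      # plain full-batch training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()

# First-order check at x0 = (0.5, 0.5): surrogate gradient vs. true gradient.
x0 = torch.tensor([[0.5, 0.5]], requires_grad=True)
model(x0).backward()
g_true = torch.tensor([-2.0 * (1 - 0.5) - 400.0 * 0.5 * (0.5 - 0.25),
                       200.0 * (0.5 - 0.25)])
print("surrogate gradient:", x0.grad.flatten())
print("true gradient:     ", g_true)
```

Comparing the printed gradients gives a rough sense of the first-order accuracy the abstract refers to; the paper's broader finding is that such neural surrogates are competitive at zero and first order only at a high training cost, while second-order (Hessian) accuracy lags behind quadratic interpolation/regression models.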


Source journal

Journal of Global Optimization (Mathematics - Applied Mathematics)
CiteScore: 0.10
Self-citation rate: 5.60%
Articles published: 137
Review time: 6 months
About the journal: The Journal of Global Optimization publishes carefully refereed papers that encompass theoretical, computational, and applied aspects of global optimization. While the focus is on original research contributions dealing with the search for global optima of non-convex, multi-extremal problems, the journal's scope covers optimization in the widest sense, including nonlinear, mixed integer, combinatorial, stochastic, robust, multi-objective optimization, computational geometry, and equilibrium problems. Relevant works on data-driven methods and optimization-based data mining are of special interest. In addition to papers covering theory and algorithms of global optimization, the journal publishes significant papers on numerical experiments, new testbeds, and applications in engineering, management, and the sciences. Applications of particular interest include healthcare, computational biochemistry, energy systems, telecommunications, and finance. Apart from full-length articles, the journal features short communications on both open and solved global optimization problems. It also offers reviews of relevant books and publishes special issues.