A single timescale stochastic quasi-Newton method for stochastic optimization

IF 1.7 · CAS Region 4 (Mathematics) · JCR Q2, MATHEMATICS, APPLIED
Peng Wang, Detong Zhu
{"title":"随机优化的单时间尺度随机拟牛顿方法","authors":"Peng Wang, Detong Zhu","doi":"10.1080/00207160.2023.2269430","DOIUrl":null,"url":null,"abstract":"AbstractIn this paper, we propose a single timescale stochastic quasi-Newton method for solving the stochastic optimization problems. The objective function of the problem is a composition of two smooth functions and their derivatives are not available. The algorithm sets to approximate sequences to estimate the gradient of the composite objective function and the inner function. The matrix correction parameters are given in BFGS update form for avoiding the assumption that Hessian matrix of objective is positive definite. We show the global convergence of the algorithm. The algorithm achieves the complexity O(ϵ−1) to find an ϵ−approximate stationary point and ensure that the expectation of the squared norm of the gradient is smaller than the given accuracy tolerance ϵ. The numerical results of nonconvex binary classification problem using the support vector machine and a multicall classification problem using neural networks are reported to show the effectiveness of the algorithm.Keywords: stochastic optimizationquasi-Newton methodBFGS update techniquemachine learning2010: 49M3765K0590C3090C56DisclaimerAs a service to authors and researchers we are providing this version of an accepted manuscript (AM). Copyediting, typesetting, and review of the resulting proofs will be undertaken on this manuscript before final publication of the Version of Record (VoR). During production and pre-press, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal relate to these versions also. AcknowledgmentsThe author thanks the support of National Natural Science Foundation (11371253) and Hainan Natural Science Foundation (120MS029).","PeriodicalId":13911,"journal":{"name":"International Journal of Computer Mathematics","volume":null,"pages":null},"PeriodicalIF":1.7000,"publicationDate":"2023-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A single timescale stochastic quasi-Newton method for stochastic optimization\",\"authors\":\"Peng Wang, Detong Zhu\",\"doi\":\"10.1080/00207160.2023.2269430\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"AbstractIn this paper, we propose a single timescale stochastic quasi-Newton method for solving the stochastic optimization problems. The objective function of the problem is a composition of two smooth functions and their derivatives are not available. The algorithm sets to approximate sequences to estimate the gradient of the composite objective function and the inner function. The matrix correction parameters are given in BFGS update form for avoiding the assumption that Hessian matrix of objective is positive definite. We show the global convergence of the algorithm. The algorithm achieves the complexity O(ϵ−1) to find an ϵ−approximate stationary point and ensure that the expectation of the squared norm of the gradient is smaller than the given accuracy tolerance ϵ. 
The numerical results of nonconvex binary classification problem using the support vector machine and a multicall classification problem using neural networks are reported to show the effectiveness of the algorithm.Keywords: stochastic optimizationquasi-Newton methodBFGS update techniquemachine learning2010: 49M3765K0590C3090C56DisclaimerAs a service to authors and researchers we are providing this version of an accepted manuscript (AM). Copyediting, typesetting, and review of the resulting proofs will be undertaken on this manuscript before final publication of the Version of Record (VoR). During production and pre-press, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal relate to these versions also. AcknowledgmentsThe author thanks the support of National Natural Science Foundation (11371253) and Hainan Natural Science Foundation (120MS029).\",\"PeriodicalId\":13911,\"journal\":{\"name\":\"International Journal of Computer Mathematics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2023-10-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Computer Mathematics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/00207160.2023.2269430\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computer Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/00207160.2023.2269430","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0

Abstract

In this paper, we propose a single-timescale stochastic quasi-Newton method for solving stochastic optimization problems in which the objective function is a composition of two smooth functions whose derivatives are not available. The algorithm constructs approximating sequences to estimate the gradient of the composite objective function and the value of the inner function. The matrix correction parameters are given in BFGS update form, which avoids the assumption that the Hessian matrix of the objective is positive definite. We prove global convergence of the algorithm and show that it achieves a complexity of O(ϵ⁻¹) to find an ϵ-approximate stationary point, that is, a point at which the expected squared norm of the gradient is smaller than the given accuracy tolerance ϵ. Numerical results on a nonconvex binary classification problem using support vector machines and a multiclass classification problem using neural networks are reported to show the effectiveness of the algorithm.
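The abstract leaves the update rules implicit, so the following is a minimal sketch, assuming a compositional objective f(x) = E[F(G(x))]: running estimates u and g of the inner-function value and the composite gradient are averaged at the same rate as the iterate moves (the "single timescale"), and the curvature matrix is corrected with Powell's classical damping rule so that it stays positive definite without any assumption on the true Hessian. The oracle names sample_G, sample_JG, and sample_dF are hypothetical, and the damping rule is a standard stand-in for the paper's BFGS correction parameters, not the authors' exact construction.

```python
import numpy as np


def powell_damped_bfgs(B, s, y, delta=0.2):
    """Damped BFGS update of the Hessian approximation B (Powell's rule).

    Damping replaces y by y_bar = theta*y + (1 - theta)*B@s, with theta
    chosen so that s @ y_bar >= delta * (s @ B @ s).  The update then keeps
    B symmetric positive definite even when the true Hessian is indefinite,
    which is the role the abstract assigns to the correction parameters.
    """
    Bs = B @ s
    sBs = s @ Bs                      # > 0 because B is positive definite
    sy = s @ y
    theta = 1.0 if sy >= delta * sBs else (1.0 - delta) * sBs / (sBs - sy)
    y_bar = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(y_bar, y_bar) / (s @ y_bar)


def single_timescale_sqn(x0, sample_G, sample_JG, sample_dF,
                         steps=2000, alpha=0.05, beta=0.05):
    """Sketch of a single-timescale stochastic quasi-Newton loop for
    f(x) = E[F(G(x))], given only noisy samples of G, its Jacobian,
    and the gradient of F.

    "Single timescale" means the iterate x and the running estimates
    (u for the inner function, g for the composite gradient) are all
    updated with step sizes of the same order, rather than refreshing
    the inner estimates on a faster schedule.
    """
    x = x0.astype(float).copy()
    u = sample_G(x)                               # estimate of G(x)
    g = sample_JG(x).T @ sample_dF(u)             # estimate of grad f(x)
    B = np.eye(x.size)                            # Hessian approximation
    for _ in range(steps):
        x_new = x + alpha * np.linalg.solve(B, -g)    # quasi-Newton step
        # Same-order averaging of the inner-function and gradient estimates.
        u = (1.0 - beta) * u + beta * sample_G(x_new)
        g_new = (1.0 - beta) * g + beta * (sample_JG(x_new).T @ sample_dF(u))
        s = x_new - x
        if s @ s > 1e-16:                         # skip degenerate pairs
            B = powell_damped_bfgs(B, s, g_new - g)
        x, g = x_new, g_new
    return x


# Toy usage: F(u) = 0.5*||u||^2 and G(x) = A@x + noise, so the minimizer
# of f(x) = E[F(G(x))] is x = 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
x_opt = single_timescale_sqn(
    np.ones(3),
    sample_G=lambda x: A @ x + 0.01 * rng.standard_normal(5),
    sample_JG=lambda x: A,
    sample_dF=lambda u: u,
)
```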
Keywords: stochastic optimization; quasi-Newton method; BFGS update technique; machine learning

2010 Mathematics Subject Classification: 49M37; 65K05; 90C30; 90C56

Disclaimer: As a service to authors and researchers we are providing this version of an accepted manuscript (AM). Copyediting, typesetting, and review of the resulting proofs will be undertaken on this manuscript before final publication of the Version of Record (VoR). During production and pre-press, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal relate to these versions also.

Acknowledgments: The authors thank the support of the National Natural Science Foundation (11371253) and the Hainan Natural Science Foundation (120MS029).
Source journal: International Journal of Computer Mathematics
CiteScore: 3.60
Self-citation rate: 0.00%
Articles published per year: 72
Review time: 5 months
About the journal: International Journal of Computer Mathematics (IJCM) is a world-leading journal serving the community of researchers in numerical analysis and scientific computing from academia to industry. IJCM publishes original research papers of high scientific value in fields of computational mathematics with profound applications to science and engineering.

IJCM welcomes papers on the analysis and applications of innovative computational strategies as well as those with rigorous explorations of cutting-edge techniques and concerns in computational mathematics. Topics IJCM considers include:
• Numerical solutions of systems of ordinary differential equations
• Numerical solution of systems or of multi-dimensional partial differential equations
• Theory and computations of nonlocal modelling and fractional partial differential equations
• Novel multi-scale modelling and computational strategies
• Parallel computations
• Numerical optimization and controls
• Imaging algorithms and vision configurations
• Computational stochastic processes and inverse problems
• Stochastic partial differential equations, Monte Carlo simulations and uncertainty quantification
• Computational finance and applications
• Highly vibrant and robust algorithms, and applications in modern industries, including but not limited to multi-physics, economics and biomedicine

Papers discussing only variations or combinations of existing methods without significant new computational properties or analysis are not of interest to IJCM. Please note that research in the development of computer systems and theory of computing is not suitable for submission to IJCM; please instead consider International Journal of Computer Mathematics: Computer Systems Theory (IJCM: CST) for your manuscript. Any papers submitted relating to these fields will be transferred to IJCM: CST, so please ensure you submit your paper to the correct journal to save time reviewing and processing your work.

Papers developed from conference proceedings: papers developed from conference proceedings or previously published work must contain at least 40% new material and significantly extend or improve upon earlier research in order to be considered for IJCM.