M. Dambrine, Ch. Dossal, B. Puig, A. Rondepierre
DOI: 10.1137/21m1435665
Published: April 11, 2024 (Journal Article)
Stochastic Differential Equations for Modeling First Order Optimization Methods
SIAM Journal on Optimization, Volume 34, Issue 2, Pages 1402-1426, June 2024. Abstract. In this article, a family of SDEs is derived as a tool for understanding the behavior of numerical optimization methods under random evaluations of the gradient. Our objective is to transpose to the stochastic setting the approach of introducing continuous versions, via ODEs, in order to understand the asymptotic behavior of a discrete optimization scheme. We consider continuous versions of the stochastic gradient scheme and of a stochastic inertial system. The article first studies the quality of the approximation of the discrete scheme by an SDE as the step size tends to 0. It then presents new asymptotic bounds on the values [math], where [math] is a solution of the SDE and [math], when [math] is convex and under integrability conditions on the noise. Results are provided under two sets of hypotheses: first considering [math] and convex functions, and then adding some geometrical properties of [math]. All of these results provide insight into the behavior of these inertial and perturbed algorithms in the setting of stochastic algorithms.
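The abstract describes approximating a discrete stochastic gradient scheme by an SDE as the step size tends to 0. The paper's precise construction is not given here, so the following is only a minimal illustrative sketch on a toy quadratic objective: it compares the stochastic gradient iteration x_{k+1} = x_k - h(∇f(x_k) + noise) with an Euler–Maruyama simulation of a commonly used limiting SDE of the form dX_t = -∇f(X_t) dt + σ√h dW_t. All function names, the choice of objective, and the specific SDE are assumptions for illustration, not the authors' construction.

```python
import math
import random

def grad(x):
    # Gradient of the toy objective f(x) = x^2 / 2.
    return x

def sgd_final(x0, h, sigma, n_steps, rng):
    # Discrete stochastic gradient scheme with additive Gaussian gradient noise:
    # x_{k+1} = x_k - h * (grad f(x_k) + sigma * xi_k).
    x = x0
    for _ in range(n_steps):
        x = x - h * (grad(x) + sigma * rng.gauss(0.0, 1.0))
    return x

def sde_final(x0, h, sigma, n_steps, rng):
    # Euler-Maruyama discretization of the (assumed) limiting SDE
    #   dX_t = -grad f(X_t) dt + sigma * sqrt(h) dW_t,
    # simulated on a finer grid dt = h / sub so the SDE itself is well resolved.
    sub = 5
    dt = h / sub
    x = x0
    for _ in range(n_steps * sub):
        x = x - dt * grad(x) + sigma * math.sqrt(h) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x

rng = random.Random(0)
runs, h, sigma, n = 500, 0.01, 0.5, 500
# Average objective value f(x_T) over many independent runs; for this quadratic,
# both processes have stationary variance sigma^2 * h / 2, so the means should agree.
sgd_mean = sum(0.5 * sgd_final(2.0, h, sigma, n, rng) ** 2 for _ in range(runs)) / runs
sde_mean = sum(0.5 * sde_final(2.0, h, sigma, n, rng) ** 2 for _ in range(runs)) / runs
print(sgd_mean, sde_mean)
```

On this toy problem both sample means settle near sigma^2 * h / 4, consistent with the idea that the SDE tracks the discrete scheme for small step sizes.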
Journal introduction:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.