{"title":"基于随机字典的自适应阶跃搜索的高概率复杂性边界","authors":"Billy Jin, Katya Scheinberg, Miaolan Xie","doi":"10.1137/22m1512764","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 3, Page 2411-2439, September 2024. <br/> Abstract. We consider a step search method for continuous optimization under a stochastic setting where the function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. (We introduce the term step search for a class of methods, similar to line search, but where step direction can change during the back-tracking procedure.) Unlike the stochastic gradient method and its many variants, the algorithm does not use a prespecified sequence of step sizes but increases or decreases the step size adaptively according to the estimated progress of the algorithm. These oracles capture multiple standard settings including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement. Under fairly general conditions on the oracles, we derive a high probability tail bound on the iteration complexity of the algorithm when it is applied to nonconvex, convex, and strongly convex (more generally, those satisfying the Polyak-Łojasiewicz (PL) condition) functions. Our analysis strengthens and extends prior results for stochastic step and line search methods.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"High Probability Complexity Bounds for Adaptive Step Search Based on Stochastic Oracles\",\"authors\":\"Billy Jin, Katya Scheinberg, Miaolan Xie\",\"doi\":\"10.1137/22m1512764\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SIAM Journal on Optimization, Volume 34, Issue 3, Page 2411-2439, September 2024. <br/> Abstract. We consider a step search method for continuous optimization under a stochastic setting where the function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. (We introduce the term step search for a class of methods, similar to line search, but where step direction can change during the back-tracking procedure.) Unlike the stochastic gradient method and its many variants, the algorithm does not use a prespecified sequence of step sizes but increases or decreases the step size adaptively according to the estimated progress of the algorithm. These oracles capture multiple standard settings including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement. Under fairly general conditions on the oracles, we derive a high probability tail bound on the iteration complexity of the algorithm when it is applied to nonconvex, convex, and strongly convex (more generally, those satisfying the Polyak-Łojasiewicz (PL) condition) functions. 
Our analysis strengthens and extends prior results for stochastic step and line search methods.\",\"PeriodicalId\":49529,\"journal\":{\"name\":\"SIAM Journal on Optimization\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIAM Journal on Optimization\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1137/22m1512764\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/22m1512764","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
High Probability Complexity Bounds for Adaptive Step Search Based on Stochastic Oracles
SIAM Journal on Optimization, Volume 34, Issue 3, Pages 2411-2439, September 2024.

Abstract. We consider a step search method for continuous optimization under a stochastic setting where the function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. (We introduce the term step search for a class of methods similar to line search, but where the step direction can change during the backtracking procedure.) Unlike the stochastic gradient method and its many variants, the algorithm does not use a prespecified sequence of step sizes but increases or decreases the step size adaptively according to the estimated progress of the algorithm. These oracles capture multiple standard settings, including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement. Under fairly general conditions on the oracles, we derive a high-probability tail bound on the iteration complexity of the algorithm when it is applied to nonconvex, convex, and strongly convex functions (more generally, those satisfying the Polyak-Łojasiewicz (PL) condition). Our analysis strengthens and extends prior results for stochastic step and line search methods.
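To make the mechanics concrete, below is a minimal Python sketch of the kind of adaptive step search loop the abstract describes: a fresh (possibly biased, noisy) gradient estimate is drawn each iteration, an Armijo-style sufficient-decrease test is run with inexact function values, and the step size grows after a successful step and shrinks after an unsuccessful one. The oracle interfaces and the parameter names and values (theta, gamma, alpha_max) are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def adaptive_step_search(zeroth_oracle, first_oracle, x0, alpha0=1.0,
                         theta=0.1, gamma=2.0, alpha_max=10.0, max_iters=1000):
    """Sketch of an adaptive step search loop with stochastic oracles.

    zeroth_oracle(x) -> noisy estimate of f(x)
    first_oracle(x)  -> noisy estimate of grad f(x)
    Parameter names and defaults are illustrative, not from the paper.
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        # A fresh gradient estimate is drawn every iteration, so the search
        # direction may change after an unsuccessful (backtracking) step --
        # this is what distinguishes step search from classical line search.
        g = first_oracle(x)
        x_trial = x - alpha * g
        # Inexact function-value estimates at the current and trial points.
        f_x, f_trial = zeroth_oracle(x), zeroth_oracle(x_trial)
        # Armijo-style sufficient-decrease test with the noisy estimates.
        if f_trial <= f_x - theta * alpha * np.dot(g, g):
            x = x_trial                            # successful: take the step
            alpha = min(gamma * alpha, alpha_max)  # and enlarge the step size
        else:
            alpha = alpha / gamma                  # unsuccessful: shrink it
    return x

# Hypothetical usage: minimize a quadratic observed through additive noise.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f0 = lambda x: float(np.dot(x, x)) + 0.01 * rng.standard_normal()
    f1 = lambda x: 2.0 * x + 0.01 * rng.standard_normal(x.shape)
    print(adaptive_step_search(f0, f1, x0=np.ones(5)))
```

Note that, unlike a prespecified step-size schedule, the step size here is driven entirely by the observed (estimated) progress, which is the adaptivity the abstract emphasizes.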
Journal Introduction:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.