{"title":"A Gradient Complexity Analysis for Minimizing the Sum of Strongly Convex Functions with Varying Condition Numbers","authors":"Nuozhou Wang, Shuzhong Zhang","doi":"10.1137/22m1503646","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 2, Page 1374-1401, June 2024. <br/> Abstract. A popular approach to minimizing a finite sum of smooth convex functions is stochastic gradient descent (SGD) and its variants. Fundamental research questions associated with SGD include (i) how to find a lower bound on the number of times that the gradient oracle of each individual function must be assessed in order to find an [math]-minimizer of the overall objective; (ii) how to design algorithms which guarantee finding an [math]-minimizer of the overall objective in expectation no more than a certain number of times (in terms of [math]) that the gradient oracle of each function needs to be assessed (i.e., upper bound). If these two bounds are at the same order of magnitude, then the algorithms may be called optimal. Most existing results along this line of research typically assume that the functions in the objective share the same condition number. In this paper, the first model we study is the problem of minimizing the sum of finitely many strongly convex functions whose condition numbers are all different. We propose an SGD-based method for this model and show that it is optimal in gradient computations, up to a logarithmic factor. We then consider a constrained separate block optimization model and present lower and upper bounds for its gradient computation complexity. Next, we propose solving the Fenchel dual of the constrained block optimization model via generalized SSNM, which we introduce earlier, and show that it yields a lower iteration complexity than solving the original model by the ADMM-type approach. Finally, we extend the analysis to the general composite convex optimization model and obtain gradient-computation complexity results under certain conditions.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/22m1503646","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Abstract
A popular approach to minimizing a finite sum of smooth convex functions is stochastic gradient descent (SGD) and its variants. Fundamental research questions associated with SGD include (i) how to find a lower bound on the number of times that the gradient oracle of each individual function must be accessed in order to find an $\epsilon$-minimizer of the overall objective, and (ii) how to design algorithms that are guaranteed to find an $\epsilon$-minimizer of the overall objective in expectation while accessing the gradient oracle of each function no more than a certain number of times (in terms of $\epsilon$), i.e., an upper bound. If these two bounds are of the same order of magnitude, then the algorithms may be called optimal. Most existing results along this line of research assume that the functions in the objective share the same condition number. In this paper, the first model we study is the problem of minimizing the sum of finitely many strongly convex functions whose condition numbers are all different. We propose an SGD-based method for this model and show that it is optimal in gradient computations, up to a logarithmic factor. We then consider a constrained separate block optimization model and present lower and upper bounds on its gradient-computation complexity. Next, we propose solving the Fenchel dual of the constrained block optimization model via the generalized SSNM introduced earlier in the paper, and show that this yields a lower iteration complexity than solving the original model by an ADMM-type approach. Finally, we extend the analysis to the general composite convex optimization model and obtain gradient-computation complexity results under certain conditions.
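To make the finite-sum setting concrete, here is a minimal sketch, not the paper's algorithm: plain SGD applied to a sum of strongly convex quadratics whose components are deliberately given different condition numbers. All names, dimensions, and step-size choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 5                                   # dimension, number of component functions
kappas = [2.0, 10.0, 50.0, 200.0, 1000.0]      # deliberately different condition numbers

# Each f_i(x) = 0.5 * x @ A_i @ x - b_i @ x with A_i diagonal and eigenvalues in [1, kappa_i],
# so f_i is 1-strongly convex and kappa_i-smooth.
As = [np.diag(np.linspace(1.0, k, d)) for k in kappas]
bs = [rng.standard_normal(d) for _ in range(n)]

x_star = np.linalg.solve(sum(As), sum(bs))     # exact minimizer of the full sum

def full_objective(x):
    return sum(0.5 * x @ A @ x - b @ x for A, b in zip(As, bs))

mu_sum = float(sum(np.diag(A).min() for A in As))     # strong-convexity constant of the sum
L_max = float(n * max(np.diag(A).max() for A in As))  # smoothness constant of a scaled component

x = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)                        # query the gradient oracle of one component
    grad_i = As[i] @ x - bs[i]                 # gradient of f_i at x
    eta = min(1.0 / L_max, 1.0 / (mu_sum * t)) # capped, then O(1/t) decaying step size
    x -= eta * n * grad_i                      # n * grad_i is an unbiased estimate of the full gradient

print("suboptimality:", full_objective(x) - full_objective(x_star))
```

The capped-then-decaying step size is the classical choice for SGD on strongly convex objectives; its gradient-oracle complexity is governed by the worst condition number among the components, which is exactly the dependence the paper's model and accelerated method aim to improve.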
Journal description:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.