Interpolation Conditions for Linear Operators and Applications to Performance Estimation Problems
Nizar Bousselmi, Julien M. Hendrickx, François Glineur
SIAM Journal on Optimization, Volume 34, Issue 3, Pages 3033-3063, September 2024.
DOI: 10.1137/23m1575391 (https://doi.org/10.1137/23m1575391)
Citations: 0
Abstract
The performance estimation problem methodology makes it possible to determine the exact worst-case performance of an optimization method. In this work, we generalize this framework to first-order methods involving linear operators. This extension requires an explicit formulation of interpolation conditions for those linear operators. We consider the class of linear operators $x \mapsto Mx$, where the matrix $M$ has bounded singular values, and the class of linear operators $x \mapsto Mx$, where $M$ is symmetric and has bounded eigenvalues. We describe interpolation conditions for these classes, i.e., necessary and sufficient conditions that, given a list of pairs $(x_i, y_i)$, characterize the existence of a linear operator in the class mapping $x_i$ to $y_i$ for all $i$. Using these conditions, we first identify the exact worst-case behavior of the gradient method applied to the composed objective $f(Mx)$, and observe that it always corresponds to $M$ being a scaling operator. We then investigate the Chambolle–Pock method applied to $\min_x f(x) + g(Mx)$, and improve the existing analysis to obtain a proof of the exact convergence rate of the primal-dual gap. In addition, we study how this method behaves on Lipschitz convex functions, and obtain a numerical convergence rate for the primal accuracy of the last iterate. We also show numerically that averaging iterates is beneficial in this setting.
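To illustrate the flavor of such interpolation conditions, the derivation below sketches the necessary direction for the bounded-operator-norm case. The Gram-matrix form is our paraphrase, assuming the class consists of operators $M$ with operator norm at most $L$; it is not a quotation of the paper's theorem.

% Stack the data as matrices X = [x_1 ... x_n] and Y = [y_1 ... y_n].
% If Y = MX for some M with \|M\| \le L, then for every coefficient vector c,
\[
  \|Yc\|^2 = \|MXc\|^2 \le L^2 \|Xc\|^2
  \quad\Longleftrightarrow\quad
  c^\top \left( L^2 X^\top X - Y^\top Y \right) c \ge 0,
\]
% so the Gram matrices must satisfy the semidefinite inequality
\[
  Y^\top Y \preceq L^2 X^\top X.
\]

The Chambolle–Pock method itself is standard; the following minimal Python sketch runs its iteration on a toy instance $\min_x \tfrac{1}{2}\|x - b\|^2 + \|Mx\|_1$ and also tracks the running average of the primal iterates, since the abstract reports that averaging is beneficial. The toy data, step sizes, and the specific choices of $f$ and $g$ are illustrative assumptions, not the paper's experimental setup.

import numpy as np

def chambolle_pock(M, b, n_iters=500):
    """Chambolle-Pock iteration for min_x 0.5*||x - b||^2 + ||M x||_1.

    Step sizes satisfy tau * sigma * ||M||^2 <= 1, the standard
    convergence condition. Returns the last primal iterate and the
    running average of the primal iterates.
    """
    m, n = M.shape
    op_norm = np.linalg.norm(M, 2)      # operator norm ||M||
    tau = sigma = 1.0 / op_norm         # so tau * sigma * ||M||^2 = 1
    theta = 1.0                         # standard extrapolation parameter
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    x_avg = np.zeros(n)
    for k in range(1, n_iters + 1):
        # Dual step: prox of sigma*g* with g = ||.||_1 is the projection
        # onto the unit infinity-norm ball, i.e., componentwise clipping.
        y = np.clip(y + sigma * (M @ x_bar), -1.0, 1.0)
        # Primal step: prox of tau*f with f = 0.5*||x - b||^2 in closed form.
        x_new = (x - tau * (M.T @ y) + tau * b) / (1.0 + tau)
        # Extrapolation and running average of the primal iterates.
        x_bar = x_new + theta * (x_new - x)
        x = x_new
        x_avg += (x - x_avg) / k
    return x, x_avg

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 50))
b = rng.standard_normal(50)
x_last, x_avg = chambolle_pock(M, b)

On this instance one can compare the objective value at x_last and x_avg; the paper's averaging observation concerns the Lipschitz convex setting, which the nonsmooth term $\|Mx\|_1$ is meant to evoke.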
Journal description:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.