{"title":"Frequency-Domain Representation of First-Order Methods: A Simple and Robust Framework of Analysis","authors":"Ioannis Anagnostides, Ioannis Panageas","doi":"10.1137/1.9781611977066.10","DOIUrl":null,"url":null,"abstract":"Motivated by recent applications in min-max optimization, we employ tools from nonlinear control theory in order to analyze a class of\"historical\"gradient-based methods, for which the next step lies in the span of the previously observed gradients within a time horizon. Specifically, we leverage techniques developed by Hu and Lessard (2017) to build a frequency-domain framework which reduces the analysis of such methods to numerically-solvable algebraic tasks, establishing linear convergence under a class of strongly monotone and co-coercive operators. On the applications' side, we focus on the Optimistic Gradient Descent (OGD) method, which augments the standard Gradient Descent with an additional past-gradient in the optimization step. The proposed framework leads to a simple and sharp analysis of OGD -- and generalizations thereof -- under a much broader regime of parameters. Notably, this characterization directly extends under adversarial noise in the observed value of the gradient. Moreover, our frequency-domain framework provides an exact quantitative comparison between simultaneous and alternating updates of OGD. An interesting byproduct is that OGD -- and variants thereof -- is an instance of PID control, arguably one of the most influential algorithms of the last century; this observation sheds more light to the stabilizing properties of\"optimism\".","PeriodicalId":93491,"journal":{"name":"Proceedings of the SIAM Symposium on Simplicity in Algorithms (SOSA)","volume":"38 1","pages":"131-160"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the SIAM Symposium on Simplicity in Algorithms (SOSA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1137/1.9781611977066.10","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 4
Abstract
Motivated by recent applications in min-max optimization, we employ tools from nonlinear control theory in order to analyze a class of "historical" gradient-based methods, for which the next step lies in the span of the previously observed gradients within a time horizon. Specifically, we leverage techniques developed by Hu and Lessard (2017) to build a frequency-domain framework which reduces the analysis of such methods to numerically-solvable algebraic tasks, establishing linear convergence under a class of strongly monotone and co-coercive operators. On the applications side, we focus on the Optimistic Gradient Descent (OGD) method, which augments standard Gradient Descent with an additional past gradient in the optimization step. The proposed framework leads to a simple and sharp analysis of OGD -- and generalizations thereof -- under a much broader regime of parameters. Notably, this characterization extends directly under adversarial noise in the observed value of the gradient. Moreover, our frequency-domain framework provides an exact quantitative comparison between simultaneous and alternating updates of OGD. An interesting byproduct is that OGD -- and variants thereof -- is an instance of PID control, arguably one of the most influential algorithms of the last century; this observation sheds more light on the stabilizing properties of "optimism".
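For concreteness, below is a minimal sketch of the OGD update referenced in the abstract (the next step combines the current gradient with the previously observed one, i.e. z_{k+1} = z_k - 2*eta*F(z_k) + eta*F(z_{k-1})). The test operator F and the step size eta are illustrative assumptions, not parameters taken from the paper.

```python
# Minimal sketch of the Optimistic Gradient Descent (OGD) update.
# Assumptions: a simple strongly monotone linear operator F(z) = A z
# and an illustrative step size eta; neither comes from the paper.
import numpy as np

def ogd(F, z0, eta=0.2, iters=200):
    """OGD: z_{k+1} = z_k - 2*eta*F(z_k) + eta*F(z_{k-1})."""
    z = z0.copy()
    g_prev = F(z)          # previously observed gradient (initialized at z0)
    residuals = []
    for _ in range(iters):
        g = F(z)
        z = z - 2 * eta * g + eta * g_prev   # optimistic step with past-gradient correction
        g_prev = g
        residuals.append(np.linalg.norm(F(z)))
    return z, residuals

if __name__ == "__main__":
    # Strongly monotone operator with a skew part, mimicking the operator
    # arising from a quadratic-plus-bilinear min-max problem.
    A = np.array([[1.0, 0.5],
                  [-0.5, 1.0]])
    F = lambda z: A @ z
    z_final, residuals = ogd(F, z0=np.ones(2))
    print("final operator norm ||F(z)||:", residuals[-1])  # decays linearly toward 0
```

On this toy operator the residual norm decays geometrically, consistent with the linear-convergence regime the abstract describes for strongly monotone and co-coercive operators.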