Title: An Accelerated Successive Convex Approximation Scheme With Exact Step Sizes for L1-Regression
Authors: Lukas Schynol; Moritz Hemsing; Marius Pesavento
Journal: IEEE Open Journal of Signal Processing, vol. 6, pp. 184-193
DOI: 10.1109/OJSP.2025.3528875
Publication date: 2025-01-14
Article page: https://ieeexplore.ieee.org/document/10840211/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10840211
Citations: 0
Abstract
We consider the minimization of $\ell_{1}$-regularized least-squares problems. A recent optimization approach uses successive convex approximations with an exact line search, which is highly competitive, especially in sparse problem instances. This work proposes an acceleration scheme for the successive convex approximation technique at negligible additional computational cost. We demonstrate this scheme by devising three related accelerated algorithms with provable convergence. The first introduces into the variable update an additional descent step along the past optimization trajectory, inspired by Nesterov's accelerated gradient method, with a closed-form step size. The second performs a simultaneous descent step along both the best response and the past trajectory, yielding a two-dimensional step size, also in closed form. The third algorithm combines the previous two approaches. All algorithms are hyperparameter-free. Empirical results confirm that the acceleration approaches improve the convergence rate compared to benchmark algorithms, and that they retain the benefits of successive convex approximation even in non-sparse instances.
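For context, the problem class and the Nesterov-style acceleration the abstract refers to can be sketched with the classical accelerated proximal gradient method (FISTA) for $\min_x \frac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$. Note that this is a standard benchmark, not the paper's successive-convex-approximation scheme with exact step sizes; it only illustrates the ℓ1-regularized least-squares objective and the "descent step along the past optimization trajectory" idea that the first proposed algorithm draws inspiration from.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=500):
    """Classical FISTA for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.

    Illustrative only: this is the standard Nesterov-accelerated proximal
    gradient method, not the accelerated SCA algorithms of the paper.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                           # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        # Momentum step along the past trajectory (x_new - x):
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

A key difference highlighted by the abstract: FISTA's momentum weight follows a fixed recursion, whereas the proposed algorithms compute the step sizes along the best response and the past trajectory exactly, in closed form, at each iteration.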