A Levenberg–Marquardt Method for Nonsmooth Regularized Least Squares
Aleksandr Y. Aravkin, Robert Baraldi, Dominique Orban
SIAM Journal on Scientific Computing, Volume 46, Issue 4, pp. A2557–A2581, August 2024. DOI: 10.1137/22m1538971
Abstract
We develop a Levenberg–Marquardt method for minimizing the sum of a smooth nonlinear least-squares term $f(x) = \tfrac{1}{2}\|F(x)\|_2^2$ and a nonsmooth term $h$. Both $f$ and $h$ may be nonconvex. Steps are computed by minimizing the sum of a regularized linear least-squares model and a model of $h$ using a first-order method such as the proximal gradient method. We establish global convergence to a first-order stationary point under the assumptions that $F$ and its Jacobian are Lipschitz continuous and $h$ is proper and lower semicontinuous. In the worst case, our method performs $O(\epsilon^{-2})$ iterations to bring a measure of stationarity below $\epsilon$. We also derive a trust-region variant that enjoys a similar asymptotic worst-case iteration complexity as a special case of the trust-region algorithm of Aravkin, Baraldi, and Orban [SIAM J. Optim., 32 (2022), pp. 900–929]. We report numerical results on three examples: a group-lasso basis-pursuit denoising problem, a nonlinear support vector machine, and parameter estimation in a neuroscience application. To implement those examples, we describe in detail how to evaluate proximal operators for separable $h$ and for the group lasso with a trust-region constraint. In all cases, the Levenberg–Marquardt methods perform fewer outer iterations than either a proximal gradient method with adaptive step length or a quasi-Newton trust-region method, neither of which exploits the least-squares structure of the problem. Our results also highlight the need for subproblem solvers more sophisticated than simple first-order methods.
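For intuition, the following is a minimal Python sketch of the step computation the abstract describes: each outer iteration approximately minimizes the model $\tfrac{1}{2}\|F(x) + J(x)s\|_2^2 + \tfrac{\sigma}{2}\|s\|_2^2 + h(x+s)$ with a proximal gradient inner loop. Everything here is an illustrative assumption rather than the authors' implementation: the names lm_step and prox_l1 are hypothetical, $h$ is taken to be $\lambda\|\cdot\|_1$ so that its proximal operator is soft-thresholding, and the inner step length is fixed at $1/L$.

```python
import numpy as np

def prox_l1(z, t):
    # Soft-thresholding: the proximal operator of t*||.||_1. A stand-in
    # for the prox of h; the paper also covers nonconvex, separable h.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lm_step(Fx, Jx, x, sigma, lam=0.1, inner_iters=100):
    # Approximately minimize the step model
    #   m(s) = 0.5*||Fx + Jx s||^2 + 0.5*sigma*||s||^2 + lam*h(x + s)
    # by the proximal gradient method (h = ||.||_1 here for illustration).
    L = np.linalg.norm(Jx, 2) ** 2 + sigma  # Lipschitz constant of the smooth gradient
    t = 1.0 / L                             # fixed proximal-gradient step length
    s = np.zeros_like(x)
    for _ in range(inner_iters):
        grad = Jx.T @ (Fx + Jx @ s) + sigma * s
        # the prox of s -> lam*h(x + s) is a shifted prox of h itself
        s = prox_l1(x + s - t * grad, t * lam) - x
    return s

# Toy usage: linear residual F(x) = A x - b, so J(x) = A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x = np.zeros(5)
s = lm_step(A @ x - b, A, x, sigma=1.0)
```

In the full method, the regularization parameter $\sigma$ would be updated across outer iterations according to the quality of each trial step, in the usual Levenberg–Marquardt fashion; the sketch fixes it for simplicity.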
Journal Description:
The purpose of SIAM Journal on Scientific Computing (SISC) is to advance computational methods for solving scientific and engineering problems.
SISC papers are classified into three categories:
1. Methods and Algorithms for Scientific Computing: Papers in this category may include theoretical analysis, provided that the relevance to applications in science and engineering is demonstrated. They should contain meaningful computational results, together with theoretical results or strong heuristics supporting the performance of new algorithms.
2. Computational Methods in Science and Engineering: Papers in this section will typically describe novel methodologies for solving a specific problem in computational science or engineering. They should contain enough information about the application to orient other computational scientists but should omit details of interest mainly to the applications specialist.
3. Software and High-Performance Computing: Papers in this category should concern the novel design and development of computational methods and high-quality software, parallel algorithms, high-performance computing issues, new architectures, data analysis, or visualization. The primary focus should be on computational methods that have potentially large impact for an important class of scientific or engineering problems.