Roberto Andreani, Kelvin R. Couto, Orizon P. Ferreira, Gabriel Haeser
{"title":"Constraint Qualifications and Strong Global Convergence Properties of an Augmented Lagrangian Method on Riemannian Manifolds","authors":"Roberto Andreani, Kelvin R. Couto, Orizon P. Ferreira, Gabriel Haeser","doi":"10.1137/23m1582382","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 2, Page 1799-1825, June 2024. <br/> Abstract. In the past several years, augmented Lagrangian methods have been successfully applied to several classes of nonconvex optimization problems, inspiring new developments in both theory and practice. In this paper we bring most of these recent developments from nonlinear programming to the context of optimization on Riemannian manifolds, including equality and inequality constraints. Many research have been conducted on optimization problems on manifolds, however only recently the treatment of the constrained case has been considered. In this paper we propose to bridge this gap with respect to the most recent developments in nonlinear programming. In particular, we formulate several well-known constraint qualifications from the Euclidean context which are sufficient for guaranteeing global convergence of augmented Lagrangian methods, without requiring boundedness of the set of Lagrange multipliers. Convergence of the dual sequence can also be assured under a weak constraint qualification. The theory presented is based on so-called sequential optimality conditions, which is a powerful tool used in this context. The paper can also be read with the Euclidean context in mind, serving as a review of the most relevant constraint qualifications and global convergence theory of state-of-the-art augmented Lagrangian methods for nonlinear programming.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/23m1582382","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
SIAM Journal on Optimization, Volume 34, Issue 2, Pages 1799-1825, June 2024. In the past several years, augmented Lagrangian methods have been successfully applied to several classes of nonconvex optimization problems, inspiring new developments in both theory and practice. In this paper we bring most of these recent developments from nonlinear programming to the context of optimization on Riemannian manifolds, including equality and inequality constraints. Much research has been conducted on optimization problems on manifolds; however, the constrained case has only recently been considered. In this paper we propose to bridge this gap with respect to the most recent developments in nonlinear programming. In particular, we formulate several well-known constraint qualifications from the Euclidean context that are sufficient to guarantee global convergence of augmented Lagrangian methods, without requiring boundedness of the set of Lagrange multipliers. Convergence of the dual sequence can also be assured under a weak constraint qualification. The theory presented is based on so-called sequential optimality conditions, which are a powerful tool in this context. The paper can also be read with the Euclidean context in mind, serving as a review of the most relevant constraint qualifications and of the global convergence theory of state-of-the-art augmented Lagrangian methods for nonlinear programming.
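The abstract does not spell out the method's formulas; for orientation, a minimal sketch of the standard Powell-Hestenes-Rockafellar (PHR) augmented Lagrangian used by Euclidean methods of this kind is given below, with the understanding that the Riemannian setting restricts the minimization to a manifold and works with the Riemannian gradient. The symbols f, h_i (equality constraints), g_j (inequality constraints), the penalty parameter rho, and the multiplier estimates lambda, mu are generic placeholders, not notation taken from the paper.

% PHR augmented Lagrangian for: minimize f(x) s.t. h(x) = 0, g(x) <= 0, x in M
% (assumed standard Euclidean form; not quoted from the paper)
\[
L_{\rho}(x,\lambda,\mu) \;=\; f(x)
  + \frac{\rho}{2}\sum_{i=1}^{m}\left(h_i(x)+\frac{\lambda_i}{\rho}\right)^{2}
  + \frac{\rho}{2}\sum_{j=1}^{p}\max\left\{0,\; g_j(x)+\frac{\mu_j}{\rho}\right\}^{2}.
\]
% Each outer iteration approximately minimizes L_rho(., lambda^k, mu^k) over the
% feasible manifold, then updates the (safeguarded) multiplier estimates and,
% when infeasibility has not decreased sufficiently, increases rho:
\[
\lambda_i^{k+1} \;=\; \lambda_i^{k} + \rho_k\, h_i(x^{k+1}), \qquad
\mu_j^{k+1} \;=\; \max\bigl\{0,\; \mu_j^{k} + \rho_k\, g_j(x^{k+1})\bigr\}.
\]

Global convergence results of the kind described in the abstract typically concern limit points of the primal sequence generated by such an outer loop, with the constraint qualifications ensuring that these limit points are Karush-Kuhn-Tucker points rather than merely stationary for an infeasibility measure.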
About the journal:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.