Inexact Riemannian Gradient Descent Method for Nonconvex Optimization
Juan Zhou, Kangkang Deng, Hongxia Wang, Zheng Peng
arXiv:2409.11181 [math.OC], 17 September 2024
Abstract
Gradient descent methods are fundamental first-order optimization algorithms in both Euclidean spaces and on Riemannian manifolds. However, the exact gradient is not readily available in many scenarios. This paper proposes a novel inexact Riemannian gradient descent algorithm for nonconvex problems, accompanied by a convergence guarantee. In particular, we establish two inexact gradient conditions on Riemannian manifolds for the first time, enabling precise gradient approximations. Our method yields strong convergence results for both the gradient sequence and the function values. Global convergence of the sequence of iterates, with constructive convergence rates, is ensured under the Riemannian Kurdyka-Łojasiewicz property. Furthermore, our algorithm encompasses two specific applications, Riemannian sharpness-aware minimization and the Riemannian extragradient algorithm, both of which inherit the global convergence properties of the inexact gradient method. Numerical experiments on low-rank matrix completion and principal component analysis problems validate the efficiency and practical relevance of the proposed approaches.
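The abstract does not state the two inexact gradient conditions themselves. As a hedged illustration only, one plausible relative-error condition, together with the retraction-based update it would drive, might read as follows; the symbols g_k, R, t_k, and θ are our notation for this sketch, not necessarily the paper's:

```latex
% Hypothetical sketch, not the paper's stated conditions: g_k is the
% inexact gradient at x_k, R a retraction, t_k > 0 a step size, and
% the relative error bound with theta in [0,1) is an assumed model.
\[
  x_{k+1} = R_{x_k}\!\left(-t_k\, g_k\right),
  \qquad
  \big\| g_k - \operatorname{grad} f(x_k) \big\|_{x_k}
  \le \theta\, \big\| \operatorname{grad} f(x_k) \big\|_{x_k},
  \quad \theta \in [0,1).
\]
```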
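For context, the Riemannian Kurdyka-Łojasiewicz property invoked for the global convergence result is commonly stated as below; this is one standard formulation, and the paper may use a variant:

```latex
% Standard Riemannian KL property at a critical point x*: there exist
% eta > 0, a neighborhood U of x*, and a concave desingularizer
% varphi with varphi(0) = 0 and varphi' > 0 such that
\[
  \varphi'\big(f(x) - f(x^*)\big)\,
  \big\| \operatorname{grad} f(x) \big\|_{x} \ge 1
  \quad \text{for all } x \in U \text{ with }
  0 < f(x) - f(x^*) < \eta .
\]
```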
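To make the idea concrete, here is a minimal self-contained sketch of inexact Riemannian gradient descent on the unit sphere, applied to a PCA-style problem (finding the leading eigenvector of a symmetric matrix, one of the experiment settings mentioned above). The relative-error oracle, the projection retraction, and the fixed step size are our assumptions for illustration, not the paper's exact scheme:

```python
# Minimal sketch of inexact Riemannian gradient descent on the unit
# sphere S^{n-1}, minimizing f(x) = -0.5 * x^T A x (a PCA surrogate).
# Hypothetical illustration: the relative-error model matches the
# assumed condition ||g_k - grad f(x_k)|| <= theta * ||grad f(x_k)||.
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T  # symmetric PSD matrix; its top eigenvector is the optimum

def cost(x):
    return -0.5 * x @ A @ x

def riem_grad(x):
    # Riemannian gradient = Euclidean gradient projected onto the
    # tangent space at x (x has unit norm).
    egrad = -A @ x
    return egrad - (x @ egrad) * x

def retract(x, v):
    # Metric projection retraction: step, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def inexact_grad(x, theta=0.3):
    # Simulate an inexact oracle: tangent noise of norm at most
    # theta * ||grad f(x)||, i.e. the assumed relative-error bound.
    g = riem_grad(x)
    noise = rng.standard_normal(n)
    noise -= (x @ noise) * x  # keep the error in the tangent space
    noise *= theta * np.linalg.norm(g) / max(np.linalg.norm(noise), 1e-16)
    return g + noise

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
t = 1.0 / np.linalg.norm(A, 2)  # conservative fixed step size

for k in range(500):
    g = inexact_grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    x = retract(x, -t * g)

# Compare against the true leading eigenvector.
w, V = np.linalg.eigh(A)
print("cost gap:", cost(x) - (-0.5 * w[-1]))
print("alignment with top eigenvector:", abs(x @ V[:, -1]))
```

Even with 30% relative error in the gradient oracle, the inexact direction retains a positive inner product with the true Riemannian gradient, so the iterates still descend; this is the intuition behind convergence guarantees under relative inexactness conditions of this type.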