Inexact Riemannian Gradient Descent Method for Nonconvex Optimization

Juan Zhou, Kangkang Deng, Hongxia Wang, Zheng Peng
{"title":"非凸优化的非精确黎曼梯度下降法","authors":"Juan Zhou, Kangkang Deng, Hongxia Wang, Zheng Peng","doi":"arxiv-2409.11181","DOIUrl":null,"url":null,"abstract":"Gradient descent methods are fundamental first-order optimization algorithms\nin both Euclidean spaces and Riemannian manifolds. However, the exact gradient\nis not readily available in many scenarios. This paper proposes a novel inexact\nRiemannian gradient descent algorithm for nonconvex problems, accompanied by a\nconvergence guarantee. In particular, we establish two inexact gradient\nconditions on Riemannian manifolds for the first time, enabling precise\ngradient approximations. Our method demonstrates strong convergence results for\nboth gradient sequences and function values. The global convergence with\nconstructive convergence rates for the sequence of iterates is ensured under\nthe Riemannian Kurdyka-\\L ojasiewicz property. Furthermore, our algorithm\nencompasses two specific applications: Riemannian sharpness-aware minimization\nand Riemannian extragradient algorithm, both of which inherit the global\nconvergence properties of the inexact gradient methods. Numerical experiments\non low-rank matrix completion and principal component analysis problems\nvalidate the efficiency and practical relevance of the proposed approaches.","PeriodicalId":501286,"journal":{"name":"arXiv - MATH - Optimization and Control","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Inexact Riemannian Gradient Descent Method for Nonconvex Optimization\",\"authors\":\"Juan Zhou, Kangkang Deng, Hongxia Wang, Zheng Peng\",\"doi\":\"arxiv-2409.11181\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gradient descent methods are fundamental first-order optimization algorithms\\nin both Euclidean spaces and Riemannian manifolds. However, the exact gradient\\nis not readily available in many scenarios. This paper proposes a novel inexact\\nRiemannian gradient descent algorithm for nonconvex problems, accompanied by a\\nconvergence guarantee. In particular, we establish two inexact gradient\\nconditions on Riemannian manifolds for the first time, enabling precise\\ngradient approximations. Our method demonstrates strong convergence results for\\nboth gradient sequences and function values. The global convergence with\\nconstructive convergence rates for the sequence of iterates is ensured under\\nthe Riemannian Kurdyka-\\\\L ojasiewicz property. Furthermore, our algorithm\\nencompasses two specific applications: Riemannian sharpness-aware minimization\\nand Riemannian extragradient algorithm, both of which inherit the global\\nconvergence properties of the inexact gradient methods. 
Numerical experiments\\non low-rank matrix completion and principal component analysis problems\\nvalidate the efficiency and practical relevance of the proposed approaches.\",\"PeriodicalId\":501286,\"journal\":{\"name\":\"arXiv - MATH - Optimization and Control\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Optimization and Control\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.11181\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Optimization and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11181","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Gradient descent methods are fundamental first-order optimization algorithms in both Euclidean spaces and Riemannian manifolds. However, the exact gradient is not readily available in many scenarios. This paper proposes a novel inexact Riemannian gradient descent algorithm for nonconvex problems, accompanied by a convergence guarantee. In particular, we establish two inexact gradient conditions on Riemannian manifolds for the first time, enabling precise gradient approximations. Our method demonstrates strong convergence results for both gradient sequences and function values. The global convergence with constructive convergence rates for the sequence of iterates is ensured under the Riemannian Kurdyka-Łojasiewicz property. Furthermore, our algorithm encompasses two specific applications: Riemannian sharpness-aware minimization and Riemannian extragradient algorithm, both of which inherit the global convergence properties of the inexact gradient methods. Numerical experiments on low-rank matrix completion and principal component analysis problems validate the efficiency and practical relevance of the proposed approaches.
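The abstract does not spell out the paper's two inexact gradient conditions. A common form of such a condition, assumed here only for illustration, bounds the oracle error relative to the true gradient, e.g. ||g_k - grad f(x_k)|| <= theta * ||grad f(x_k)|| for some theta in [0, 1). The sketch below is a minimal illustration under these assumptions, not the paper's algorithm: inexact Riemannian gradient descent on the unit sphere for the principal component analysis problem mentioned in the experiments, using a hypothetical additive-noise gradient oracle and a normalization retraction. The function name and error model are assumptions, not taken from the paper.

```python
import numpy as np

def inexact_rgd_sphere_pca(A, x0, step=1e-3, noise=1e-3, iters=2000, tol=1e-8, seed=0):
    """Minimal sketch: inexact Riemannian gradient descent on the unit sphere
    for f(x) = -x^T A x (leading-eigenvector / PCA problem).

    Hypothetical error model: the exact Riemannian gradient is perturbed by
    additive tangent noise, standing in for an inexact gradient oracle.
    The step size should stay well below 1 / (2 * ||A||_2) for stability.
    """
    rng = np.random.default_rng(seed)
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = -2.0 * A @ x                 # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        g = rgrad + noise * rng.standard_normal(x.shape)
        g -= (x @ g) * x                     # keep the approximation tangent
        if np.linalg.norm(g) < tol:
            break
        y = x - step * g                     # tangent step ...
        x = y / np.linalg.norm(y)            # ... followed by retraction to the sphere
    return x

# Usage: the minimizer is the leading eigenvector of a symmetric matrix A.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = M @ M.T                                  # symmetric positive semidefinite
x = inexact_rgd_sphere_pca(A, rng.standard_normal(50))
print(x @ A @ x, np.linalg.eigvalsh(A)[-1])  # Rayleigh quotient vs. top eigenvalue
```

With a sufficiently small step size and noise level, the final Rayleigh quotient x^T A x approaches the largest eigenvalue of A, matching the exact-gradient behavior that the inexact convergence theory is designed to extend.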