Woocheol Choi, Changbum Chun, Yoon Mo Jung, Sangwoon Yun
{"title":"论黎曼近似梯度法的线性收敛速率","authors":"Woocheol Choi, Changbum Chun, Yoon Mo Jung, Sangwoon Yun","doi":"10.1007/s11590-024-02129-6","DOIUrl":null,"url":null,"abstract":"<p>Composite optimization problems on Riemannian manifolds arise in applications such as sparse principal component analysis and dictionary learning. Recently, Huang and Wei introduced a Riemannian proximal gradient method (Huang and Wei in MP 194:371–413, 2022) and an inexact Riemannian proximal gradient method (Wen and Ke in COA 85:1–32, 2023), utilizing the retraction mapping to address these challenges. They established the sublinear convergence rate of the Riemannian proximal gradient method under the retraction convexity and a geometric condition on retractions, as well as the local linear convergence rate of the inexact Riemannian proximal gradient method under the Riemannian Kurdyka-Lojasiewicz property. In this paper, we demonstrate the linear convergence rate of the Riemannian proximal gradient method and the linear convergence rate of the proximal gradient method proposed in Chen et al. (SIAM J Opt 30:210–239, 2020) under strong retraction convexity. Additionally, we provide a counterexample that violates the geometric condition on retractions, which is crucial for establishing the sublinear convergence rate of the Riemannian proximal gradient method.</p>","PeriodicalId":49720,"journal":{"name":"Optimization Letters","volume":"5 1","pages":""},"PeriodicalIF":1.3000,"publicationDate":"2024-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On the linear convergence rate of Riemannian proximal gradient method\",\"authors\":\"Woocheol Choi, Changbum Chun, Yoon Mo Jung, Sangwoon Yun\",\"doi\":\"10.1007/s11590-024-02129-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Composite optimization problems on Riemannian manifolds arise in applications such as sparse principal component analysis and dictionary learning. Recently, Huang and Wei introduced a Riemannian proximal gradient method (Huang and Wei in MP 194:371–413, 2022) and an inexact Riemannian proximal gradient method (Wen and Ke in COA 85:1–32, 2023), utilizing the retraction mapping to address these challenges. They established the sublinear convergence rate of the Riemannian proximal gradient method under the retraction convexity and a geometric condition on retractions, as well as the local linear convergence rate of the inexact Riemannian proximal gradient method under the Riemannian Kurdyka-Lojasiewicz property. In this paper, we demonstrate the linear convergence rate of the Riemannian proximal gradient method and the linear convergence rate of the proximal gradient method proposed in Chen et al. (SIAM J Opt 30:210–239, 2020) under strong retraction convexity. 
Additionally, we provide a counterexample that violates the geometric condition on retractions, which is crucial for establishing the sublinear convergence rate of the Riemannian proximal gradient method.</p>\",\"PeriodicalId\":49720,\"journal\":{\"name\":\"Optimization Letters\",\"volume\":\"5 1\",\"pages\":\"\"},\"PeriodicalIF\":1.3000,\"publicationDate\":\"2024-06-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimization Letters\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s11590-024-02129-6\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Letters","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s11590-024-02129-6","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
On the linear convergence rate of Riemannian proximal gradient method
Composite optimization problems on Riemannian manifolds arise in applications such as sparse principal component analysis and dictionary learning. Recently, Huang and Wei introduced a Riemannian proximal gradient method (Huang and Wei in MP 194:371–413, 2022) and an inexact Riemannian proximal gradient method (Wen and Ke in COA 85:1–32, 2023), utilizing the retraction mapping to address these challenges. They established the sublinear convergence rate of the Riemannian proximal gradient method under retraction convexity and a geometric condition on retractions, as well as the local linear convergence rate of the inexact Riemannian proximal gradient method under the Riemannian Kurdyka–Łojasiewicz property. In this paper, we establish the linear convergence rate of the Riemannian proximal gradient method, as well as that of the proximal gradient method proposed in Chen et al. (SIAM J Optim 30:210–239, 2020), under strong retraction convexity. Additionally, we provide a counterexample that violates the geometric condition on retractions, which is crucial for establishing the sublinear convergence rate of the Riemannian proximal gradient method.
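For context, the following is a minimal sketch of a generic Riemannian proximal gradient update of the kind analyzed in this line of work; the notation (manifold \(\mathcal{M}\), tangent space \(T_{x_k}\mathcal{M}\), retraction \(R\), composite objective \(F = f + g\) with \(f\) smooth and \(g\) convex but possibly nonsmooth, stepsize parameter \(L\)) is generic and is an assumption for illustration, not taken from the paper itself. At each iterate \(x_k\), a tangent direction is obtained from a proximal-type subproblem and the next iterate is produced by the retraction:
\[
  \eta_k \;=\; \operatorname*{arg\,min}_{\eta \in T_{x_k}\mathcal{M}}
    \;\bigl\langle \operatorname{grad} f(x_k), \eta \bigr\rangle
    \;+\; \tfrac{L}{2}\,\|\eta\|_{x_k}^{2}
    \;+\; g\bigl(R_{x_k}(\eta)\bigr),
  \qquad
  x_{k+1} \;=\; R_{x_k}(\eta_k),
\]
where \(\operatorname{grad} f\) denotes the Riemannian gradient. The abstract above concerns the rate at which such iterates converge: sublinear under retraction convexity plus a geometric condition on the retraction, and linear under strong retraction convexity.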
Journal introduction:
Optimization Letters is an international journal covering all aspects of optimization, including theory, algorithms, computational studies, and applications, and providing an outlet for rapid publication of short communications in the field. Originality, significance, quality and clarity are the essential criteria for choosing the material to be published.
Optimization has been expanding in all directions at an astonishing rate during the last few decades. New algorithmic and theoretical techniques have been developed, the diffusion into other disciplines has proceeded at a rapid pace, and our knowledge of all aspects of the field has grown even more profound. At the same time, one of the most striking trends in optimization is the constantly increasing interdisciplinary nature of the field.
Optimization Letters aims to communicate in a timely fashion all recent developments in optimization with concise short articles (limited to a total of ten journal pages). Such concise articles are easily accessible to readers working in any aspect of optimization who wish to be informed of recent developments.