Two Decentralized Conjugate Gradient Methods with Global Convergence

Liping Wang, Hao Wu, Hongchao Zhang
{"title":"两种具有全局收敛性的分散共轭梯度法","authors":"Liping Wang, Hao Wu, Hongchao Zhang","doi":"arxiv-2409.07122","DOIUrl":null,"url":null,"abstract":"This paper considers the decentralized optimization problem of minimizing a\nfinite sum of continuously differentiable functions over a fixed-connected\nundirected network. Summarizing the lack of previously developed decentralized\nconjugate gradient methods, we propose two decentralized conjugate gradient\nmethod, called NDCG and DMBFGS respectively. Firstly, the best of our\nknowledge, NDCG is the first decentralized conjugate gradient method to be\nshown to have global convergence with constant stepsizes for general nonconvex\noptimization problems, which profits from our designed conjugate parameter and\nrelies only on the same mild conditions as the centralized conjugate gradient\nmethod. Secondly, we apply the memoryless BFGS technique and develop the DMBFGS\nmethod. It requires only vector-vector products to capture the curvature\ninformation of Hessian matrices. Under proper choice of stepsizes, DMBFGS has\nglobal linear convergence for solving strongly convex decentralized\noptimization problems. Our numerical results show DMBFGS is very efficient\ncompared with other state-of-the-art methods for solving decentralized\noptimization.","PeriodicalId":501286,"journal":{"name":"arXiv - MATH - Optimization and Control","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Two Decentralized Conjugate Gradient Methods with Global Convergence\",\"authors\":\"Liping Wang, Hao Wu, Hongchao Zhang\",\"doi\":\"arxiv-2409.07122\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper considers the decentralized optimization problem of minimizing a\\nfinite sum of continuously differentiable functions over a fixed-connected\\nundirected network. Summarizing the lack of previously developed decentralized\\nconjugate gradient methods, we propose two decentralized conjugate gradient\\nmethod, called NDCG and DMBFGS respectively. Firstly, the best of our\\nknowledge, NDCG is the first decentralized conjugate gradient method to be\\nshown to have global convergence with constant stepsizes for general nonconvex\\noptimization problems, which profits from our designed conjugate parameter and\\nrelies only on the same mild conditions as the centralized conjugate gradient\\nmethod. Secondly, we apply the memoryless BFGS technique and develop the DMBFGS\\nmethod. It requires only vector-vector products to capture the curvature\\ninformation of Hessian matrices. Under proper choice of stepsizes, DMBFGS has\\nglobal linear convergence for solving strongly convex decentralized\\noptimization problems. 
Our numerical results show DMBFGS is very efficient\\ncompared with other state-of-the-art methods for solving decentralized\\noptimization.\",\"PeriodicalId\":501286,\"journal\":{\"name\":\"arXiv - MATH - Optimization and Control\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Optimization and Control\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.07122\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Optimization and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07122","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This paper considers the decentralized optimization problem of minimizing a finite sum of continuously differentiable functions over a fixed connected undirected network. Motivated by the scarcity of previously developed decentralized conjugate gradient methods, we propose two decentralized conjugate gradient methods, called NDCG and DMBFGS respectively. First, to the best of our knowledge, NDCG is the first decentralized conjugate gradient method shown to be globally convergent with constant stepsizes for general nonconvex optimization problems; this result benefits from our designed conjugate parameter and relies only on the same mild conditions as the centralized conjugate gradient method. Second, we apply the memoryless BFGS technique and develop the DMBFGS method, which requires only vector-vector products to capture the curvature information of the Hessian matrices. Under a proper choice of stepsizes, DMBFGS achieves global linear convergence for strongly convex decentralized optimization problems. Our numerical results show that DMBFGS is very efficient compared with other state-of-the-art methods for decentralized optimization.
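As context for the abstract, the problem class it describes is usually written in the standard decentralized finite-sum/consensus form below; the symbols n, p, f_i, and x_i are our own notation for illustration and are not quoted from the paper.

```latex
% Decentralized finite-sum problem over a fixed connected undirected
% network of n nodes, where node i privately holds the continuously
% differentiable function f_i:
\min_{x \in \mathbb{R}^p} \quad f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x).
% Equivalent consensus reformulation: node i keeps a local copy x_i, and
% agreement is enforced only through neighbor-to-neighbor communication:
\min_{x_1,\dots,x_n \in \mathbb{R}^p} \quad \frac{1}{n} \sum_{i=1}^{n} f_i(x_i)
\quad \text{s.t.} \quad x_1 = x_2 = \dots = x_n .
```

The claim that DMBFGS captures curvature using only vector-vector products can be illustrated with the classical memoryless BFGS direction. The sketch below is a standard centralized building block written for illustration; it is an assumption on our part, not the paper's exact DMBFGS iteration, which additionally involves the decentralized communication and stepsize rules described in the paper.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Return d = -H g, where H is the memoryless BFGS inverse-Hessian
    approximation built from the most recent pair s = x_k - x_{k-1},
    y = grad_k - grad_{k-1}. Only inner products and scaled vector sums
    are used; no matrix is ever formed or stored.
    (Illustrative sketch of the classical memoryless BFGS formula, not
    necessarily the exact local update used inside DMBFGS.)"""
    sy = float(s @ y)
    if sy <= 1e-12:            # curvature condition fails: fall back to steepest descent
        return -g
    rho = 1.0 / sy
    a = float(s @ g)           # s^T g
    b = float(y @ g)           # y^T g
    c = float(y @ y)           # y^T y
    # H g with H = (I - rho*s*y^T)(I - rho*y*s^T) + rho*s*s^T, expanded so that
    # only the scalars a, b, c and two scaled vector additions are needed.
    Hg = g - rho * a * y + (rho * a - rho * b + rho**2 * a * c) * s
    return -Hg
```

Because H is never formed explicitly, the per-iteration cost and memory of such an update stay proportional to the problem dimension, which matches the abstract's point about relying only on vector-vector products.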