{"title":"两种具有全局收敛性的分散共轭梯度法","authors":"Liping Wang, Hao Wu, Hongchao Zhang","doi":"arxiv-2409.07122","DOIUrl":null,"url":null,"abstract":"This paper considers the decentralized optimization problem of minimizing a\nfinite sum of continuously differentiable functions over a fixed-connected\nundirected network. Summarizing the lack of previously developed decentralized\nconjugate gradient methods, we propose two decentralized conjugate gradient\nmethod, called NDCG and DMBFGS respectively. Firstly, the best of our\nknowledge, NDCG is the first decentralized conjugate gradient method to be\nshown to have global convergence with constant stepsizes for general nonconvex\noptimization problems, which profits from our designed conjugate parameter and\nrelies only on the same mild conditions as the centralized conjugate gradient\nmethod. Secondly, we apply the memoryless BFGS technique and develop the DMBFGS\nmethod. It requires only vector-vector products to capture the curvature\ninformation of Hessian matrices. Under proper choice of stepsizes, DMBFGS has\nglobal linear convergence for solving strongly convex decentralized\noptimization problems. Our numerical results show DMBFGS is very efficient\ncompared with other state-of-the-art methods for solving decentralized\noptimization.","PeriodicalId":501286,"journal":{"name":"arXiv - MATH - Optimization and Control","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Two Decentralized Conjugate Gradient Methods with Global Convergence\",\"authors\":\"Liping Wang, Hao Wu, Hongchao Zhang\",\"doi\":\"arxiv-2409.07122\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper considers the decentralized optimization problem of minimizing a\\nfinite sum of continuously differentiable functions over a fixed-connected\\nundirected network. Summarizing the lack of previously developed decentralized\\nconjugate gradient methods, we propose two decentralized conjugate gradient\\nmethod, called NDCG and DMBFGS respectively. Firstly, the best of our\\nknowledge, NDCG is the first decentralized conjugate gradient method to be\\nshown to have global convergence with constant stepsizes for general nonconvex\\noptimization problems, which profits from our designed conjugate parameter and\\nrelies only on the same mild conditions as the centralized conjugate gradient\\nmethod. Secondly, we apply the memoryless BFGS technique and develop the DMBFGS\\nmethod. It requires only vector-vector products to capture the curvature\\ninformation of Hessian matrices. Under proper choice of stepsizes, DMBFGS has\\nglobal linear convergence for solving strongly convex decentralized\\noptimization problems. 
Our numerical results show DMBFGS is very efficient\\ncompared with other state-of-the-art methods for solving decentralized\\noptimization.\",\"PeriodicalId\":501286,\"journal\":{\"name\":\"arXiv - MATH - Optimization and Control\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Optimization and Control\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.07122\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Optimization and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07122","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Two Decentralized Conjugate Gradient Methods with Global Convergence
This paper considers the decentralized optimization problem of minimizing a finite sum of continuously differentiable functions over a fixed, connected, undirected network (stated formally below). Motivated by the lack of previously developed decentralized conjugate gradient methods, we propose two such methods, called NDCG and DMBFGS, respectively. First, to the best of our knowledge, NDCG is the first decentralized conjugate gradient method shown to have global convergence with constant stepsizes for general nonconvex optimization problems; this result benefits from our specially designed conjugate parameter and relies only on the same mild conditions as the centralized conjugate gradient method. Second, we apply the memoryless BFGS technique and develop the DMBFGS method, which requires only vector-vector products to capture the curvature information of Hessian matrices (see the illustrative sketch below). Under a proper choice of stepsizes, DMBFGS achieves global linear convergence for solving strongly convex decentralized optimization problems. Our numerical results show that DMBFGS is highly efficient compared with other state-of-the-art methods for decentralized optimization.
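
For context, the decentralized problem described in the abstract has the standard finite-sum form shown below. The exact smoothness and connectivity assumptions are those stated in the paper; the mixing-matrix notation W is the usual convention in this literature rather than something quoted from the abstract.

```latex
% Decentralized finite-sum minimization over a network of n nodes:
% each node i privately holds f_i and exchanges information only with
% its neighbors, typically through a symmetric doubly stochastic mixing
% matrix W whose sparsity pattern matches the network's edges.
\min_{x \in \mathbb{R}^d} \quad f(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x)
```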
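The memoryless BFGS direction mentioned above can be illustrated with the classical Shanno-type update, which applies the BFGS formula to the identity matrix using only the most recent pair (s, y) and therefore needs nothing beyond inner products and vector additions. The following is a minimal sketch of that standard update, not the specific DMBFGS iteration from the paper; the function name and the steepest-descent fallback are illustrative assumptions.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y, eps=1e-12):
    """Return d = -H g, where H is the BFGS update of the identity
    built from the most recent displacement s and gradient change y.

    Only inner products and vector additions are used, so no matrix is
    ever formed and the per-iteration cost is O(d).
    """
    sy = float(s @ y)
    if sy <= eps:
        # Curvature condition fails; fall back to steepest descent
        # (an illustrative safeguard, not necessarily the paper's rule).
        return -g
    sg = float(s @ g)
    yg = float(y @ g)
    yy = float(y @ y)
    # Expansion of H g with H = (I - s y^T/sy)(I - y s^T/sy) + s s^T/sy
    Hg = g - (sg * y + yg * s) / sy + (1.0 + yy / sy) * (sg / sy) * s
    return -Hg
```

In a decentralized method such as DMBFGS, each node would form a direction of this kind from local quantities and combine it with information received from its neighbors; the precise combination and stepsize rules are those given in the paper.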