{"title":"通过平均梯度流进行黎曼联盟学习","authors":"Zhenwei Huang, Wen Huang, Pratik Jawanpuria, Bamdev Mishra","doi":"arxiv-2409.07223","DOIUrl":null,"url":null,"abstract":"In recent years, federated learning has garnered significant attention as an\nefficient and privacy-preserving distributed learning paradigm. In the\nEuclidean setting, Federated Averaging (FedAvg) and its variants are a class of\nefficient algorithms for expected (empirical) risk minimization. This paper\ndevelops and analyzes a Riemannian Federated Averaging Gradient Stream\n(RFedAGS) algorithm, which is a generalization of FedAvg, to problems defined\non a Riemannian manifold. Under standard assumptions, the convergence rate of\nRFedAGS with fixed step sizes is proven to be sublinear for an approximate\nstationary solution. If decaying step sizes are used, the global convergence is\nestablished. Furthermore, assuming that the objective obeys the Riemannian\nPolyak-{\\L}ojasiewicz property, the optimal gaps generated by RFedAGS with\nfixed step size are linearly decreasing up to a tiny upper bound, meanwhile, if\ndecaying step sizes are used, then the gaps sublinearly vanish. Numerical simulations conducted on synthetic and real-world data demonstrate\nthe performance of the proposed RFedAGS.","PeriodicalId":501286,"journal":{"name":"arXiv - MATH - Optimization and Control","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Riemannian Federated Learning via Averaging Gradient Stream\",\"authors\":\"Zhenwei Huang, Wen Huang, Pratik Jawanpuria, Bamdev Mishra\",\"doi\":\"arxiv-2409.07223\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, federated learning has garnered significant attention as an\\nefficient and privacy-preserving distributed learning paradigm. In the\\nEuclidean setting, Federated Averaging (FedAvg) and its variants are a class of\\nefficient algorithms for expected (empirical) risk minimization. This paper\\ndevelops and analyzes a Riemannian Federated Averaging Gradient Stream\\n(RFedAGS) algorithm, which is a generalization of FedAvg, to problems defined\\non a Riemannian manifold. Under standard assumptions, the convergence rate of\\nRFedAGS with fixed step sizes is proven to be sublinear for an approximate\\nstationary solution. If decaying step sizes are used, the global convergence is\\nestablished. Furthermore, assuming that the objective obeys the Riemannian\\nPolyak-{\\\\L}ojasiewicz property, the optimal gaps generated by RFedAGS with\\nfixed step size are linearly decreasing up to a tiny upper bound, meanwhile, if\\ndecaying step sizes are used, then the gaps sublinearly vanish. 
Numerical simulations conducted on synthetic and real-world data demonstrate\\nthe performance of the proposed RFedAGS.\",\"PeriodicalId\":501286,\"journal\":{\"name\":\"arXiv - MATH - Optimization and Control\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - MATH - Optimization and Control\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.07223\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Optimization and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07223","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Riemannian Federated Learning via Averaging Gradient Stream
In recent years, federated learning has garnered significant attention as an efficient and privacy-preserving distributed learning paradigm. In the Euclidean setting, Federated Averaging (FedAvg) and its variants form a class of efficient algorithms for expected (empirical) risk minimization. This paper develops and analyzes the Riemannian Federated Averaging Gradient Stream (RFedAGS) algorithm, a generalization of FedAvg to problems defined on a Riemannian manifold. Under standard assumptions, RFedAGS with fixed step sizes is proven to converge sublinearly to an approximate stationary solution; with decaying step sizes, global convergence is established.
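To make "converge sublinearly to an approximate stationary solution" concrete: fixed-step-size guarantees of this type in the FedAvg literature typically take the following shape, shown here as a hedged illustration with generic constants $C_1, C_2$ (not the paper's exact bound), where $\alpha$ is the fixed step size and $T$ the number of communication rounds:

\[
\min_{0 \le t < T} \mathbb{E}\big[\|\operatorname{grad} f(x_t)\|^2\big] \;\le\; \frac{C_1}{\alpha T} + C_2\,\alpha.
\]

The stationarity measure thus decays sublinearly in $T$, but only down to a floor proportional to the step size, which is why a fixed step size yields an approximate, rather than exact, stationary solution.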
Furthermore, if the objective satisfies the Riemannian Polyak-Łojasiewicz (PL) property, the optimality gaps generated by RFedAGS with a fixed step size decrease linearly up to a small upper bound, whereas with decaying step sizes the gaps vanish sublinearly.
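For reference, the Riemannian Polyak-Łojasiewicz property mentioned above is the standard condition that, for some constant $\mu > 0$ and minimum value $f^{*}$ of $f$ over the manifold $\mathcal{M}$,

\[
f(x) - f^{*} \;\le\; \frac{1}{2\mu}\,\|\operatorname{grad} f(x)\|^{2} \quad \text{for all } x \in \mathcal{M},
\]

where $\operatorname{grad} f$ denotes the Riemannian gradient. The condition holds without convexity and is what permits linearly decreasing optimality gaps under a fixed step size.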
Numerical simulations on synthetic and real-world data demonstrate the performance of the proposed RFedAGS.
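The abstract does not spell out the algorithm itself, but the following is a minimal runnable sketch, under stated assumptions, of the averaging idea behind a Riemannian FedAvg-style round on the unit sphere: each client computes its Riemannian gradient at the shared server iterate, the server averages these tangent vectors (all of which lie in the same tangent space and so can be averaged directly), and the iterate is updated through a retraction. This illustrates the general technique, not the paper's RFedAGS; the single-local-step structure, the Rayleigh-quotient objective, and all names are assumptions made here for concreteness.

```python
import numpy as np

def rgrad_rayleigh(A, x):
    # Riemannian gradient on the unit sphere of f(x) = -0.5 * x^T A x:
    # project the Euclidean gradient -A x onto the tangent space at x.
    egrad = -A @ x
    return egrad - (x @ egrad) * x  # (I - x x^T) egrad

def retract(x, xi):
    # Projection retraction on the sphere: step along xi, then renormalize.
    y = x + xi
    return y / np.linalg.norm(y)

def riemannian_fedavg_sketch(client_mats, n_rounds=300, step=0.05, seed=0):
    # Hypothetical single-local-step round: every client evaluates its
    # Riemannian gradient at the shared iterate x; since all these vectors
    # live in the same tangent space, the server averages them directly
    # and takes one retraction step with a fixed step size.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(client_mats[0].shape[0])
    x /= np.linalg.norm(x)
    for _ in range(n_rounds):
        avg_grad = np.mean([rgrad_rayleigh(A, x) for A in client_mats], axis=0)
        x = retract(x, -step * avg_grad)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Synthetic clients: each holds a random symmetric PSD matrix; the global
    # objective is minimized at the leading eigenvector of the clients' mean.
    mats = []
    for _ in range(5):
        B = rng.standard_normal((20, 20))
        mats.append(B @ B.T)
    x = riemannian_fedavg_sketch(mats)
    lead = np.linalg.eigh(np.mean(mats, axis=0))[1][:, -1]
    print("alignment with leading eigenvector:", abs(x @ lead))
```

Averaging gradients at a common point, rather than averaging the manifold-valued iterates themselves, sidesteps the fact that points on a manifold cannot be combined by simple arithmetic means, which is one natural way to generalize FedAvg to the Riemannian setting.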