Ehsan Lari, Vinay Chakravarthi Gogineni, R. Arablouei, Stefan Werner
{"title":"对通信错误具有鲁棒性的资源高效联邦学习","authors":"Ehsan Lari, Vinay Chakravarthi Gogineni, R. Arablouei, Stefan Werner","doi":"10.1109/SSP53291.2023.10208024","DOIUrl":null,"url":null,"abstract":"The effectiveness of federated learning (FL) in leveraging distributed datasets is highly contingent upon the accuracy of model exchanges between clients and servers. Communication errors caused by noisy links can negatively impact learning accuracy. To address this issue, we present an FL algorithm that is robust to communication errors while reducing the communication load on clients. To derive the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We cast the considered problem as a distributed optimization problem over a federated network, which employs random scheduling to enhance communication efficiency, and solve it using the alternating direction method of multipliers. To improve robustness, we eliminate the local dual parameters and reduce the number of global model exchanges via a change of variable. We analyze the mean convergence of our proposed algorithm and demonstrate its effectiveness compared with related existing algorithms via simulations.","PeriodicalId":296346,"journal":{"name":"2023 IEEE Statistical Signal Processing Workshop (SSP)","volume":"128 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Resource-Efficient Federated Learning Robust to Communication Errors\",\"authors\":\"Ehsan Lari, Vinay Chakravarthi Gogineni, R. Arablouei, Stefan Werner\",\"doi\":\"10.1109/SSP53291.2023.10208024\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The effectiveness of federated learning (FL) in leveraging distributed datasets is highly contingent upon the accuracy of model exchanges between clients and servers. Communication errors caused by noisy links can negatively impact learning accuracy. To address this issue, we present an FL algorithm that is robust to communication errors while reducing the communication load on clients. To derive the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We cast the considered problem as a distributed optimization problem over a federated network, which employs random scheduling to enhance communication efficiency, and solve it using the alternating direction method of multipliers. To improve robustness, we eliminate the local dual parameters and reduce the number of global model exchanges via a change of variable. 
We analyze the mean convergence of our proposed algorithm and demonstrate its effectiveness compared with related existing algorithms via simulations.\",\"PeriodicalId\":296346,\"journal\":{\"name\":\"2023 IEEE Statistical Signal Processing Workshop (SSP)\",\"volume\":\"128 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE Statistical Signal Processing Workshop (SSP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SSP53291.2023.10208024\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Statistical Signal Processing Workshop (SSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSP53291.2023.10208024","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Resource-Efficient Federated Learning Robust to Communication Errors
The effectiveness of federated learning (FL) in leveraging distributed datasets is highly contingent upon the accuracy of model exchanges between clients and servers. Communication errors caused by noisy links can negatively impact learning accuracy. To address this issue, we present an FL algorithm that is robust to communication errors while reducing the communication load on clients. To derive the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We cast this problem as a distributed optimization problem over a federated network, employ random scheduling of clients to enhance communication efficiency, and solve it using the alternating direction method of multipliers (ADMM). To improve robustness, we eliminate the local dual parameters and reduce the number of global model exchanges via a change of variables. We analyze the mean convergence of the proposed algorithm and demonstrate its effectiveness compared with related existing algorithms via simulations.
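The abstract does not spell out the update rules. As a point of reference only, the standard consensus-form weighted least-squares problem and its textbook ADMM iterations are sketched below; the symbols $A_k$, $b_k$, $\Sigma_k$, $z$, $u_k$, and $\rho$ are illustrative notation and are not taken from the paper.

\[
\min_{w_1,\dots,w_K,\,z}\ \sum_{k=1}^{K} \tfrac{1}{2}\,(A_k w_k - b_k)^\top \Sigma_k (A_k w_k - b_k)
\quad \text{s.t.}\quad w_k = z,\ \ k = 1,\dots,K,
\]

where client $k$ holds data $(A_k, b_k)$ with weight matrix $\Sigma_k$ and $z$ is the global model. The standard (scaled-dual) ADMM updates with penalty parameter $\rho > 0$ are

\[
w_k^{t+1} = \big(A_k^\top \Sigma_k A_k + \rho I\big)^{-1}\big(A_k^\top \Sigma_k b_k + \rho\,(z^t - u_k^t)\big),
\]
\[
z^{t+1} = \frac{1}{K}\sum_{k=1}^{K}\big(w_k^{t+1} + u_k^t\big),
\qquad
u_k^{t+1} = u_k^t + w_k^{t+1} - z^{t+1}.
\]

Per the abstract, the proposed algorithm departs from this baseline by eliminating the local dual variables (here $u_k$) through a change of variables and by randomly scheduling which clients exchange models with the server in each round, which reduces the number of global model exchanges and improves robustness to noisy links.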