Title: Robust Federated Learning Under Worst-Case Model
Authors: F. Ang, Li Chen, Weidong Wang
Venue: 2020 IEEE Wireless Communications and Networking Conference (WCNC)
Publication date: 2020-05-01
DOI: 10.1109/WCNC45663.2020.9120713

Abstract:
Federated learning provides a communication-efficient training process by alternating between local training and averaging of the updated local models. However, it requires perfect acquisition of the model, which is hard to achieve in wireless communication in practice, and the resulting noise seriously degrades federated learning. To tackle this challenge, we propose a robust design for federated learning that reduces the effect of noise. Accounting for the noise introduced in the communication steps, we first formulate the problem as a parallel optimization at each node under a worst-case noise model. Because the maximum-noise condition is unavailable and the objective function is non-convex, we employ a sampling-based successive convex approximation algorithm to develop a feasible training scheme. In addition, the convergence rate of the proposed design is analyzed from a theoretical point of view. Finally, simulations demonstrate that the proposed design improves prediction accuracy and reduces the loss function value.
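The alternation the abstract describes (local training, then averaging of noisy uploaded models) can be illustrated with a minimal sketch. This is not the paper's algorithm; it is a generic FedAvg-style loop on a synthetic least-squares task, where additive Gaussian noise on each upload stands in for imperfect model acquisition over the wireless link. All function names, the noise model, and the data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, steps=10):
    """One round of local gradient descent on a least-squares objective."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(w_global, datasets, noise_std=0.0):
    """Each node trains locally; the server averages the (possibly noisy) uploads."""
    uploads = []
    for X, y in datasets:
        w_local = local_update(w_global.copy(), X, y)
        # Additive noise models imperfect model acquisition over the wireless link.
        uploads.append(w_local + noise_std * rng.standard_normal(w_local.shape))
    return np.mean(uploads, axis=0)

# Synthetic data on 4 nodes sharing a common ground-truth model.
w_true = np.array([1.0, -2.0, 0.5])
datasets = []
for _ in range(4):
    X = rng.standard_normal((50, 3))
    y = X @ w_true + 0.01 * rng.standard_normal(50)
    datasets.append((X, y))

w_clean = np.zeros(3)
w_noisy = np.zeros(3)
for _ in range(20):
    w_clean = federated_round(w_clean, datasets, noise_std=0.0)
    w_noisy = federated_round(w_noisy, datasets, noise_std=0.5)

err_clean = np.linalg.norm(w_clean - w_true)
err_noisy = np.linalg.norm(w_noisy - w_true)
print(err_clean, err_noisy)
```

Running this shows the noisy uploads leave a larger residual error than the noiseless case, which is the degradation the paper's worst-case robust design targets.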