Distributionally Robust Federated Learning for Mobile Edge Networks

Long Tan Le, Tung-Anh Nguyen, Tuan-Dung Nguyen, Nguyen H. Tran, Nguyen Binh Truong, Phuong L. Vo, Bui Thanh Hung, Tuan Anh Le

Mobile Networks and Applications, published 2024-05-03. DOI: 10.1007/s11036-024-02316-w
Abstract
Federated Learning (FL) revolutionizes data processing in mobile networks by enabling collaborative learning without raw data exchange. This not only reduces latency and enhances computational efficiency but also enables the system to adapt, learn, and optimize performance from the user's context in real time. Nevertheless, FL faces challenges in training and generalization due to statistical heterogeneity, stemming from the diverse nature of data across varying user contexts. To address these challenges, we propose \(\textsf{WAFL}\), a robust FL framework grounded in Wasserstein distributionally robust optimization, aimed at enhancing model generalization against all adversarial distributions within a predefined Wasserstein ambiguity set. We approach \(\textsf{WAFL}\) by formulating it as an empirical surrogate risk minimization problem, which is then solved using a novel federated algorithm. Experimental results demonstrate that \(\textsf{WAFL}\) outperforms other robust FL baselines in non-i.i.d. settings, showcasing superior generalization and robustness to significant distribution shifts.
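To make the surrogate-risk idea concrete, the sketch below shows a generic Wasserstein DRO training loss in PyTorch, following the standard Lagrangian relaxation of the inner supremum (approximated by a few gradient-ascent steps on perturbed inputs). This is a minimal illustration of the general technique, not the paper's \(\textsf{WAFL}\) algorithm; the penalty weight `gamma`, the number of ascent steps, and the step size are all hypothetical parameters chosen for exposition.

```python
# Minimal sketch of a Wasserstein DRO surrogate loss (Lagrangian relaxation).
# Assumption: this follows the generic WDRO recipe, not WAFL's exact surrogate
# or its federated aggregation rule, which are defined in the paper.
import torch
import torch.nn.functional as F

def wdro_surrogate_loss(model, x, y, gamma=1.0, ascent_steps=5, ascent_lr=0.1):
    """Approximate sup_z [ loss(model(z), y) - gamma * ||z - x||^2 ]
    with a few gradient-ascent steps on the perturbed inputs z, then
    return the loss at the resulting adversarial points for minimization."""
    z = x.clone().detach().requires_grad_(True)
    for _ in range(ascent_steps):
        # Transport-cost penalty: squared Euclidean distance per sample.
        penalty = gamma * ((z - x) ** 2).sum(dim=tuple(range(1, x.dim())))
        obj = (F.cross_entropy(model(z), y, reduction="none") - penalty).sum()
        (grad,) = torch.autograd.grad(obj, z)
        z = (z + ascent_lr * grad).detach().requires_grad_(True)
    # Outer minimization: ordinary loss evaluated at the perturbed batch.
    return F.cross_entropy(model(z.detach()), y)
```

In a federated setting, one plausible use of such a loss is for each client to run its local epochs on `wdro_surrogate_loss` instead of the plain empirical loss, with the server averaging model updates FedAvg-style; the paper's own federated algorithm for solving the surrogate problem may differ.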