Authors: Marica Amadeo, Claudia Campolo, Giuseppe Ruggeri, Antonella Molinaro
DOI: 10.1016/j.comnet.2025.111353
Published in: Computer Networks, vol. 267, Article 111353, 2025-05-18
Available at: https://www.sciencedirect.com/science/article/pii/S1389128625003202
Improving communication performance of Federated Learning: A networking perspective
Federated Learning (FL) is gaining momentum as a promising solution for efficient and privacy-preserving distributed training of Machine Learning (ML) models. Unlike centralized ML solutions, only the ML model and its updates are transferred between the clients and the aggregator server, eliminating the need to share large datasets. However, poor connectivity over the path interconnecting the FL clients and the aggregator server, whether due to (wireless) channel losses or congestion, may degrade training convergence. Several methods have been devised to reduce the training duration, primarily by minimizing data transfer through the design of ML algorithms at the application level. These solutions, however, leave issues unsettled: they may reduce the communication footprint, but they do not improve the communication process as a whole. In contrast, in this work our aim is to improve FL data exchange from a networking perspective by promoting Information Centric Networking (ICN) approaches rather than host-centric TCP/IP-based solutions. To this end, we analyze the impact that host-centric transport protocols and ICN approaches have on FL performance, in terms of the duration of model training and the load of exchanged data (model and updates), under different channel loss settings. We show that ICN-based FL solutions significantly reduce the network data load and decrease the duration of the training round by up to an order of magnitude for high channel loss rates.
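The FL exchange the abstract refers to — clients send only model parameters, and the server combines them — can be illustrated with a minimal Federated Averaging (FedAvg) sketch. This is not code from the paper; the function and variable names are illustrative, and real deployments would operate on framework tensors rather than plain lists.

```python
def fedavg(client_models, client_sizes):
    """Aggregate client parameter vectors by a dataset-size-weighted
    average (FedAvg). Only these parameters cross the network; the
    clients' raw training data never leaves the devices."""
    total = sum(client_sizes)
    dim = len(client_models[0])
    aggregated = [0.0] * dim
    for params, n in zip(client_models, client_sizes):
        for i, w in enumerate(params):
            aggregated[i] += w * n / total
    return aggregated

# Three hypothetical clients with unequal dataset sizes: the server
# receives one small parameter vector per client and averages them.
models = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 20, 30]
global_model = fedavg(models, sizes)  # weighted mean per coordinate
print(global_model)
```

Each training round repeats this pattern (server broadcasts the global model, clients train locally, server aggregates the updates), which is why the per-round transfer of model parameters dominates FL's communication cost and why lossy paths between clients and aggregator slow convergence.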
Journal description:
Computer Networks is an international, archival journal providing a publication vehicle for complete coverage of all topics of interest to those involved in the computer communications networking area. The audience includes researchers, managers and operators of networks as well as designers and implementors. The Editorial Board will consider any material for publication that is of interest to those groups.