Rómulo Bustincio, Allan M. de Souza, Joahannes B. D. da Costa, Luis F. G. Gonzalez, Luiz F. Bittencourt
Title: Reducing communication overhead through one-shot model pruning in federated learning
DOI: 10.1007/s12243-025-01097-x
Journal: Annals of Telecommunications, vol. 80 (9-10), pp. 901-913
Published: 2025-05-05 (Journal Article)
URL: https://link.springer.com/article/10.1007/s12243-025-01097-x
Citations: 0
Abstract
In the realm of federated learning, a collaborative yet decentralized approach to machine learning, communication efficiency is a critical concern, particularly under constraints of limited bandwidth and resources. This paper evaluates FedSNIP, a novel method that leverages the SNIP (Single-shot Network Pruning based on Connection Sensitivity) technique within this context. By utilizing SNIP, FedSNIP effectively prunes neural networks, converting numerous weights to zero and resulting in sparser weight representations. This substantial reduction in weight density significantly decreases the volume of parameters that need to be communicated to the server, thereby reducing the communication overhead. Our experiments on the CIFAR-10 and UCI-HAR datasets demonstrate that FedSNIP not only lowers the data transmission between clients and the server but also maintains competitive model accuracy, comparable to conventional federated learning models. Additionally, we analyze various compression algorithms applied after pruning, specifically evaluating the compressed sparse row (CSR), coordinate list (COO), and compressed sparse column (CSC) formats to identify the most efficient approach. Our results show that compressed sparse row not only compresses the data more effectively and quickly but also achieves the highest reduction in data size, making it the most suitable format for enhancing the efficiency of federated learning, particularly in scenarios with restricted communication capabilities.
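The pipeline the abstract describes can be illustrated with a minimal sketch: score each connection by the SNIP-style saliency |weight × gradient|, keep only the most salient connections in one shot, and then measure how much smaller the pruned layer becomes in the three sparse formats the paper compares. The matrix shape, pruning ratio, and random data below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# Stand-in dense layer weights and a one-batch gradient (illustrative only).
W = rng.standard_normal((256, 128)).astype(np.float32)
G = rng.standard_normal((256, 128)).astype(np.float32)

# SNIP-style connection saliency: magnitude of weight x gradient.
saliency = np.abs(W * G)

# One-shot pruning: keep the top 10% most salient connections (assumed ratio).
keep = int(0.10 * W.size)
threshold = np.partition(saliency.ravel(), -keep)[-keep]
W_pruned = W * (saliency >= threshold)

# Store the sparse result in the three formats evaluated in the paper.
csr = sparse.csr_matrix(W_pruned)
csc = sparse.csc_matrix(W_pruned)
coo = sparse.coo_matrix(W_pruned)

def nbytes(m):
    """Total bytes of the arrays backing a scipy sparse matrix."""
    if isinstance(m, sparse.coo_matrix):
        return m.data.nbytes + m.row.nbytes + m.col.nbytes
    return m.data.nbytes + m.indices.nbytes + m.indptr.nbytes

print("dense bytes:", W.nbytes)
print("csr bytes  :", nbytes(csr))
print("csc bytes  :", nbytes(csc))
print("coo bytes  :", nbytes(coo))
```

At a 10% density, all three sparse formats are far smaller than the dense array; CSR and CSC additionally beat COO because they replace one of the two index arrays with a short row/column pointer array. The exact ranking of CSR vs. CSC depends on the matrix shape, which is why the paper benchmarks them empirically.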
About the journal
Annals of Telecommunications is an international journal publishing original peer-reviewed papers in the field of telecommunications. It covers all the essential branches of modern telecommunications, ranging from digital communications to communication networks and the internet, and spanning software, protocols and services, as well as uses and economics. This broad spectrum of topics reflects the rapid convergence, through telecommunications, of the underlying technologies in computing, communications, and content management toward the emergence of the information and knowledge society. As a consequence, the journal provides a medium for exchanging research results and technological achievements accomplished by the European and international scientific community, from academia and industry.