On the Importance of Temporal Dependencies of Weight Updates in Communication Efficient Federated Learning

Homayun Afrabandpey, Rangu Goutham, Honglei Zhang, Francesco Cricri, Emre B. Aksu, H. R. Tavakoli

2022 IEEE International Conference on Visual Communications and Image Processing (VCIP), 13 December 2022. DOI: 10.1109/VCIP56404.2022.10008860
Abstract
This paper studies the effect of exploiting the temporal dependency between successive weight updates to compress communications in Federated Learning (FL). To this end, we propose residual coding for FL, which exploits temporal dependencies by communicating compressed residuals of the weight updates whenever doing so reduces bandwidth. We further consider Temporal Context Adaptation (TCA), which compares co-located elements of consecutive weight updates to select the optimal setting for compressing the bitstream with the DeepCABAC encoder. Following the experimental settings of the MPEG standard on Neural Network Compression (NNC), we demonstrate that both temporal-dependency-based technologies reduce communication overhead, with the maximum reduction obtained when both technologies are applied simultaneously.
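The residual-coding idea described above can be sketched in a few lines: a client compares the compressed size of its current weight update against the compressed size of the residual with respect to the previous update, and transmits whichever bitstream is smaller. Below is a minimal illustrative sketch, not the paper's implementation; the `compress` helper (uniform quantization plus zlib), the function names, and the payload format are assumptions standing in for the DeepCABAC/NNC encoder used in the paper.

```python
# Minimal sketch of residual coding for FL weight updates, for illustration only.
# `compress` (uniform quantization + zlib) is a placeholder standing in for the
# DeepCABAC / NNC encoder; names and payload format are assumptions.
import zlib
from typing import Optional

import numpy as np


def compress(update: np.ndarray, step: float = 1e-3) -> bytes:
    """Placeholder codec: uniform scalar quantization followed by zlib."""
    quantized = np.round(update / step).astype(np.int32)
    return zlib.compress(quantized.tobytes())


def encode_update(curr_update: np.ndarray,
                  prev_update: Optional[np.ndarray]) -> dict:
    """Send the compressed residual of successive weight updates whenever
    it is smaller than the directly compressed update."""
    direct = compress(curr_update)
    if prev_update is None:                      # first round: no temporal context yet
        return {"mode": "direct", "payload": direct}
    residual = compress(curr_update - prev_update)
    if len(residual) < len(direct):              # use residual coding only when it saves bits
        return {"mode": "residual", "payload": residual}
    return {"mode": "direct", "payload": direct}
```

On the receiving side, a "residual" payload would be decoded and added to the previously reconstructed update before aggregation, so the signalled mode is what keeps encoder and decoder in sync.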