A strategy to the reduction of communication overhead and overfitting in Federated Learning
Alex Barros, D. Rosário, E. Cerqueira, N. L. D. Fonseca
DOI: 10.5753/wgrs.2021.17181 (https://doi.org/10.5753/wgrs.2021.17181)
Anais do XXVI Workshop de Gerência e Operação de Redes e Serviços (WGRS 2021), published 2021-08-16
Citations: 3
Abstract
Federated learning (FL) is a framework for training machine learning models on decentralized data, especially when that data is unbalanced and non-IID. Adaptive methods can accelerate convergence, reducing the number of rounds of local computation and of communication with a centralized server. This paper proposes an adaptive controller that uses a Poisson distribution to adjust the number of local epochs, avoiding overfitting of the aggregated model and promoting fast convergence. Our results indicate that simply increasing the number of local updates should be avoided, and that a complementary mechanism is needed to sustain model performance. We evaluate the impact of an increasing number of local epochs on FedAVG and FedADAM.
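As a rough illustration of the idea described in the abstract, the sketch below samples the number of local epochs for each FL round from a Poisson distribution whose mean shrinks as training progresses. The schedule (base_mean, decay), the floor of one epoch, and the function name are assumptions made for illustration; the paper's actual controller rule is not given in the abstract.

```python
import numpy as np

def sample_local_epochs(round_idx, base_mean=5.0, decay=0.9, min_epochs=1, rng=None):
    """Draw the number of local epochs for a given round from a Poisson
    distribution whose mean decays across rounds (hypothetical schedule)."""
    rng = rng or np.random.default_rng()
    mean = max(min_epochs, base_mean * decay ** round_idx)
    # Clamp to at least min_epochs so every client performs some local work.
    return max(min_epochs, int(rng.poisson(mean)))

# Example: sampled local-epoch budget over the first five rounds.
for r in range(5):
    print(f"round {r}: {sample_local_epochs(r, rng=np.random.default_rng(r))} epochs")
```

Under such a scheme, early rounds allow more local computation while later rounds keep local updates short, which is one plausible way to trade communication overhead against overfitting of the aggregated model.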