Clustered Hierarchical Distributed Federated Learning
Yan Gou, Ruiyu Wang, Zonghui Li, M. Imran, Lei Zhang
ICC 2022 - IEEE International Conference on Communications, 2022-05-16. DOI: 10.1109/ICC45855.2022.9838880
Citations: 3
Abstract
In recent years, growing concern over data privacy has driven the rapid development of federated learning, in which clients synchronize only model parameters rather than personal data. However, traditional federated learning systems still depend heavily on a central server, offer no guarantee of client participation or server reliability, and consume communication resources at an extremely high rate. We therefore propose Clustered Hierarchical Distributed Federated Learning to address these problems. Clustering motivates client participation, and a distributed architecture removes the dependence on a central server. Within each cluster, we apply a hierarchical segmented gossip protocol with a feedback mechanism for model exchange; between clusters, a gossip protocol handles communication. Together these make full use of available bandwidth while preserving good training convergence. Experimental results demonstrate that our method achieves better performance with lower communication resource consumption.
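The core idea of gossip-based model exchange can be illustrated with a minimal simulation. The sketch below is not the paper's algorithm; it is a hypothetical simplification in which each cluster's model is reduced to a single scalar, and one gossip round pairs clusters at random and averages their models. Repeated rounds drive all values toward the global mean without any central server, which is the property the inter-cluster gossip protocol relies on.

```python
import random

def gossip_round(models, rng):
    """One gossip round: clusters are paired at random, and each pair
    averages its models. Scalars stand in for full model weights
    (a hypothetical simplification for illustration)."""
    idx = list(range(len(models)))
    rng.shuffle(idx)
    for a, b in zip(idx[::2], idx[1::2]):
        avg = (models[a] + models[b]) / 2.0
        models[a] = models[b] = avg
    return models

def gossip_average(models, rounds=20, seed=0):
    """Run several gossip rounds; values move toward the global mean."""
    rng = random.Random(seed)
    models = list(models)
    for _ in range(rounds):
        gossip_round(models, rng)
    return models

if __name__ == "__main__":
    start = [0.0, 2.0, 4.0, 6.0]   # initial "models" of four clusters
    final = gossip_average(start)
    print(final)  # values cluster around the global mean 3.0
```

Note that pairwise averaging preserves the sum of the models, so the consensus value each cluster converges to is exactly the average of the initial models; this is what makes decentralized averaging a drop-in replacement for a central aggregation server in this simplified setting.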