Clustered Hierarchical Distributed Federated Learning

Yan Gou, Ruiyu Wang, Zonghui Li, M. Imran, Lei Zhang
{"title":"Clustered Hierarchical Distributed Federated Learning","authors":"Yan Gou, Ruiyu Wang, Zonghui Li, M. Imran, Lei Zhang","doi":"10.1109/ICC45855.2022.9838880","DOIUrl":null,"url":null,"abstract":"In recent years, due to the increasing concern about data privacy security, federated learning, whose clients only synchronize the model rather than the personal data, has developed rapidly. However, the traditional federated learning system still has a high dependence on the central server, an unguaranteed enthusiasm of clients and reliability of the central server, and extremely high consumption of communication resources. Therefore, we propose Clustered Hierarchical Distributed Federated Learning to solve the above problems. We motivate the participation of clients by clustering and solve the dependence on the central server through distributed architecture. We apply a hierarchical segmented gossip protocol and feedback mechanism for in-cluster model exchange and gossip protocol for communication between clusters to make full use of bandwidth and have good training convergence. Experimental results demonstrate that our method has better performance with less communication resource consumption.","PeriodicalId":193890,"journal":{"name":"ICC 2022 - IEEE International Conference on Communications","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICC 2022 - IEEE International Conference on Communications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICC45855.2022.9838880","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

In recent years, growing concern about data privacy has driven the rapid development of federated learning, in which clients synchronize only the model rather than their personal data. However, traditional federated learning systems still depend heavily on a central server, cannot guarantee either the clients' willingness to participate or the reliability of that server, and consume communication resources at an extremely high rate. We therefore propose Clustered Hierarchical Distributed Federated Learning to address these problems. We motivate client participation through clustering and remove the dependence on the central server through a distributed architecture. We apply a hierarchical segmented gossip protocol with a feedback mechanism for in-cluster model exchange, and a gossip protocol for communication between clusters, to make full use of the available bandwidth while retaining good training convergence. Experimental results demonstrate that our method achieves better performance with lower communication resource consumption.