Efficient Federated Learning Method for Cloud-Edge Network Communication

Jing Duan, Jie Duan, Xuefeng Wan, Yang Li
{"title":"云边缘网络通信的高效联邦学习方法","authors":"Jing Duan, Jie Duan, Xuefeng Wan, Yang Li","doi":"10.1109/CISCE58541.2023.10142819","DOIUrl":null,"url":null,"abstract":"Traditional federated learning (FL) decomposes the server's training tasks to the client for parallel learning, which increases the computational burden on the client and increases the communication overhead for model exchange between the server and the client. Although split federated learning divides the model and trains it on the client and server respectively, the computational pressure of client training is reduced, but the intermediate features and gradient information between the client and the server introduce additional communication costs. At the same time, if the number of clients is too large, the calculation pressure of the server-side parallel training model will also increase greatly. In this paper, we propose a split federated learning method based on client-side clustering, which reduces the additional communication between the client and the server by clustering the client according to the data distribution and compressing the activation vector of the split layer by vector quantization. It can also reduce the computing overhead of the server. The experimental results show that the method proposed in this paper can reduce the communication cost by 51.2% and the training time by 33.3% while maintaining the same accuracy.","PeriodicalId":145263,"journal":{"name":"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficient Federated Learning Method for Cloud-Edge Network Communication\",\"authors\":\"Jing Duan, Jie Duan, Xuefeng Wan, Yang Li\",\"doi\":\"10.1109/CISCE58541.2023.10142819\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Traditional federated learning (FL) decomposes the server's training tasks to the client for parallel learning, which increases the computational burden on the client and increases the communication overhead for model exchange between the server and the client. Although split federated learning divides the model and trains it on the client and server respectively, the computational pressure of client training is reduced, but the intermediate features and gradient information between the client and the server introduce additional communication costs. At the same time, if the number of clients is too large, the calculation pressure of the server-side parallel training model will also increase greatly. In this paper, we propose a split federated learning method based on client-side clustering, which reduces the additional communication between the client and the server by clustering the client according to the data distribution and compressing the activation vector of the split layer by vector quantization. It can also reduce the computing overhead of the server. 
The experimental results show that the method proposed in this paper can reduce the communication cost by 51.2% and the training time by 33.3% while maintaining the same accuracy.\",\"PeriodicalId\":145263,\"journal\":{\"name\":\"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CISCE58541.2023.10142819\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISCE58541.2023.10142819","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Traditional federated learning (FL) offloads the server's training task to clients for parallel learning, which increases the clients' computational burden and adds communication overhead for exchanging models between the server and the clients. Split federated learning partitions the model and trains the parts on the client and the server respectively, reducing the client-side computational load, but exchanging the intermediate features and gradients of the split layer introduces additional communication cost. Moreover, when the number of clients is large, the server's load from training the per-client model partitions in parallel also grows sharply. In this paper, we propose a split federated learning method based on client-side clustering: clients are clustered according to their data distributions, and the activation vectors of the split layer are compressed by vector quantization, which reduces the extra client-server communication and also lowers the server's computing overhead. Experimental results show that the proposed method reduces communication cost by 51.2% and training time by 33.3% while maintaining the same accuracy.
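
The full text is not reproduced here, but the two mechanisms the abstract names can be illustrated with a minimal sketch: grouping clients by their data distributions, and vector-quantizing split-layer activations so that each client sends small codebook indices instead of raw float vectors. The Python/NumPy sketch below is an illustrative assumption, not the authors' implementation; all names (kmeans, client_label_hist, codebook) and the sizes used are hypothetical.

# Illustrative sketch only (assumed, not the paper's code): client clustering
# by label distribution, plus vector quantization of split-layer activations.
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means; returns (centroids, assignments)."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # Squared distances via ||x||^2 - 2*x.c + ||c||^2 (memory stays N*k).
        d2 = ((points ** 2).sum(1)[:, None]
              - 2.0 * points @ centroids.T
              + (centroids ** 2).sum(1)[None, :])
        assign = d2.argmin(axis=1)
        for j in range(k):
            members = points[assign == j]
            if len(members):  # keep the old centroid if a cluster went empty
                centroids[j] = members.mean(axis=0)
    return centroids, assign

# (1) Cluster clients by data distribution: describe each client by its
# normalized label histogram and group similar clients, so the server can
# train one model partition per cluster instead of one per client.
rng = np.random.default_rng(1)
client_label_hist = rng.random((100, 10))                  # 100 clients, 10 classes
client_label_hist /= client_label_hist.sum(1, keepdims=True)
_, client_cluster = kmeans(client_label_hist, k=5)

# (2) Vector-quantize split-layer activations: learn a 256-entry codebook,
# then send one uint8 index per activation vector instead of d float32 values.
acts = rng.standard_normal((4096, 64)).astype(np.float32)  # toy activation batch
codebook, codes = kmeans(acts, k=256)

raw_bytes = acts.nbytes                                     # 4096 * 64 * 4
vq_bytes = codes.astype(np.uint8).nbytes + codebook.nbytes  # indices + codebook
print(f"compression ratio ~{raw_bytes / vq_bytes:.1f}x")    # ~15x in this toy setup

With a 256-entry codebook each activation vector travels as one uint8 index, so a d-dimensional float32 vector shrinks from 4d bytes to 1 byte plus an amortized share of the codebook; this is the kind of split-layer traffic reduction the abstract reports.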