{"title":"云边缘网络通信的高效联邦学习方法","authors":"Jing Duan, Jie Duan, Xuefeng Wan, Yang Li","doi":"10.1109/CISCE58541.2023.10142819","DOIUrl":null,"url":null,"abstract":"Traditional federated learning (FL) decomposes the server's training tasks to the client for parallel learning, which increases the computational burden on the client and increases the communication overhead for model exchange between the server and the client. Although split federated learning divides the model and trains it on the client and server respectively, the computational pressure of client training is reduced, but the intermediate features and gradient information between the client and the server introduce additional communication costs. At the same time, if the number of clients is too large, the calculation pressure of the server-side parallel training model will also increase greatly. In this paper, we propose a split federated learning method based on client-side clustering, which reduces the additional communication between the client and the server by clustering the client according to the data distribution and compressing the activation vector of the split layer by vector quantization. It can also reduce the computing overhead of the server. The experimental results show that the method proposed in this paper can reduce the communication cost by 51.2% and the training time by 33.3% while maintaining the same accuracy.","PeriodicalId":145263,"journal":{"name":"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficient Federated Learning Method for Cloud-Edge Network Communication\",\"authors\":\"Jing Duan, Jie Duan, Xuefeng Wan, Yang Li\",\"doi\":\"10.1109/CISCE58541.2023.10142819\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Traditional federated learning (FL) decomposes the server's training tasks to the client for parallel learning, which increases the computational burden on the client and increases the communication overhead for model exchange between the server and the client. Although split federated learning divides the model and trains it on the client and server respectively, the computational pressure of client training is reduced, but the intermediate features and gradient information between the client and the server introduce additional communication costs. At the same time, if the number of clients is too large, the calculation pressure of the server-side parallel training model will also increase greatly. In this paper, we propose a split federated learning method based on client-side clustering, which reduces the additional communication between the client and the server by clustering the client according to the data distribution and compressing the activation vector of the split layer by vector quantization. It can also reduce the computing overhead of the server. 
The experimental results show that the method proposed in this paper can reduce the communication cost by 51.2% and the training time by 33.3% while maintaining the same accuracy.\",\"PeriodicalId\":145263,\"journal\":{\"name\":\"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CISCE58541.2023.10142819\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 5th International Conference on Communications, Information System and Computer Engineering (CISCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISCE58541.2023.10142819","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Traditional federated learning (FL) offloads the server's training task to clients for parallel learning, which places a heavy computational burden on the clients and adds communication overhead for exchanging models between the server and the clients. Split federated learning instead partitions the model and trains it jointly on the client and the server; this relieves the clients' computational load, but exchanging the intermediate features and gradients of the split layer introduces additional communication cost. Moreover, when there are many clients, the server's load from training their model partitions in parallel also grows sharply. In this paper, we propose a split federated learning method based on client clustering. By grouping clients according to their data distributions and compressing the split-layer activation vectors with vector quantization, the method reduces the extra client-server communication and also lowers the server's computational overhead. Experimental results show that the proposed method reduces communication cost by 51.2% and training time by 33.3% while maintaining the same accuracy.
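
The paper does not include code; as background for the split training scheme it builds on, the following PyTorch sketch shows one step of plain split learning for a single client: the client runs the front partition up to the split layer, the server finishes the forward and backward pass, and the split-layer gradient is returned to the client. The partition sizes, layer choices, and optimizer settings here are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of one split-training step (hypothetical names; a single
# client is shown, whereas the paper runs many clients in parallel).
import torch
import torch.nn as nn

client_front = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU())  # client partition
server_back = nn.Sequential(nn.Linear(128, 10))                             # server partition
opt_c = torch.optim.SGD(client_front.parameters(), lr=0.01)
opt_s = torch.optim.SGD(server_back.parameters(), lr=0.01)

def split_training_step(x, y):
    # client: forward to the split layer; the activations are sent to the server
    smashed = client_front(x)
    smashed_srv = smashed.detach().requires_grad_(True)

    # server: finish the forward pass, compute the loss, backprop to the split layer
    loss = nn.functional.cross_entropy(server_back(smashed_srv), y)
    opt_s.zero_grad()
    loss.backward()
    opt_s.step()

    # client: receive the split-layer gradient and finish backpropagation
    opt_c.zero_grad()
    smashed.backward(smashed_srv.grad)
    opt_c.step()
    return loss.item()

# usage: split_training_step(torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,)))
```

Every `smashed` tensor and every `smashed_srv.grad` crossing the network in this loop is exactly the per-round traffic that the paper's quantization targets.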
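The abstract says clients are clustered "according to the data distribution" but does not give the metric. One plausible reading, purely an assumption here, is to summarize each client by its normalized label histogram and run k-means over those histograms; `cluster_clients` and the cluster count of 4 are hypothetical.

```python
# One plausible realization of "cluster clients by data distribution":
# k-means over per-client normalized label histograms (assumed detail).
import numpy as np
from scipy.cluster.vq import kmeans2

def client_histograms(client_labels, num_classes):
    """Stack each client's label counts into a row, normalized to sum to 1."""
    hists = np.asarray(
        [np.bincount(y, minlength=num_classes) for y in client_labels],
        dtype=np.float64,
    )
    return hists / hists.sum(axis=1, keepdims=True)

# toy data: 20 clients, 500 labels each, 10 classes
client_labels = [np.random.randint(0, 10, size=500) for _ in range(20)]
hists = client_histograms(client_labels, num_classes=10)
_, cluster_of = kmeans2(hists, 4, minit="++")  # cluster_of[i] = group of client i
```

Grouping clients with similar distributions lets the server train one model partition per cluster rather than one per client, which is consistent with the claimed reduction in server-side computation.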
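Likewise, the vector quantization of the split-layer activations is named but not specified. A generic scheme consistent with the description: learn a small codebook by k-means, transmit one codeword index per activation vector, and reconstruct on the server. The codebook size k=64 and the use of SciPy's classic VQ routines are stand-ins, not the paper's design.

```python
# Generic vector quantization of split-layer activations (illustrative only;
# the paper does not publish its codebook construction).
import numpy as np
from scipy.cluster.vq import kmeans2, vq

acts = np.random.randn(256, 128)            # stand-in split-layer activations (n, d)

codebook, _ = kmeans2(acts, 64, minit="++") # learn 64 codewords by k-means
indices, _ = vq(acts, codebook)             # client sends one small index per vector
recon = codebook[indices]                   # server reconstructs before its forward pass

# With d=128 float32 activations and k=64, a uint8 index replaces 512 bytes
# per vector, a large reduction even before any entropy coding.
```

The trade-off is the usual one for lossy compression in split learning: a smaller codebook cuts more traffic but distorts the activations the server trains on, so k must be chosen to preserve accuracy.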