{"title":"联邦学习中全局训练的沟通高效方法","authors":"D. M. S. Bhatti, Muhammad Haris, Haewoon Nam","doi":"10.1109/ICTC55196.2022.9952661","DOIUrl":null,"url":null,"abstract":"Federated learning is a privacy preserving method of training the model on server by utilizing the end users' private data without accessing it. The central server shares the global model with all end users, called clients of the network. The clients are required to train the shared global model using their local datasets. The updated local trained models are forwarded back to the server to further update the global model. This process of training the global model is carried out for several rounds. The procedure of updating the local model and transmitting back to the server rises the communication cost. Since several clients are involved in training the global model, the aggregated communication cost of the network is escalated. This article proposes a communication effective aggregation method for federated learning, which considers the volume and variety of local clients' data before aggregation. The proposed approach is compared with the conventional methods and it achieves highest accuracy and minimum loss with respect to aggregated communication cost.","PeriodicalId":441404,"journal":{"name":"2022 13th International Conference on Information and Communication Technology Convergence (ICTC)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"A Communication Efficient Approach of Global Training in Federated Learning\",\"authors\":\"D. M. S. Bhatti, Muhammad Haris, Haewoon Nam\",\"doi\":\"10.1109/ICTC55196.2022.9952661\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning is a privacy preserving method of training the model on server by utilizing the end users' private data without accessing it. The central server shares the global model with all end users, called clients of the network. The clients are required to train the shared global model using their local datasets. The updated local trained models are forwarded back to the server to further update the global model. This process of training the global model is carried out for several rounds. The procedure of updating the local model and transmitting back to the server rises the communication cost. Since several clients are involved in training the global model, the aggregated communication cost of the network is escalated. This article proposes a communication effective aggregation method for federated learning, which considers the volume and variety of local clients' data before aggregation. 
The proposed approach is compared with the conventional methods and it achieves highest accuracy and minimum loss with respect to aggregated communication cost.\",\"PeriodicalId\":441404,\"journal\":{\"name\":\"2022 13th International Conference on Information and Communication Technology Convergence (ICTC)\",\"volume\":\"8 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 13th International Conference on Information and Communication Technology Convergence (ICTC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICTC55196.2022.9952661\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 13th International Conference on Information and Communication Technology Convergence (ICTC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTC55196.2022.9952661","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Communication Efficient Approach of Global Training in Federated Learning
Federated learning is a privacy-preserving method of training a model on a server by utilizing end users' private data without directly accessing it. The central server shares the global model with all end users, called the clients of the network. The clients train the shared global model on their local datasets, and the updated local models are sent back to the server to further update the global model. This process of training the global model is carried out over several rounds. Updating the local models and transmitting them back to the server raises the communication cost, and since many clients participate in training the global model, the aggregated communication cost of the network escalates. This article proposes a communication-efficient aggregation method for federated learning that considers the volume and variety of the local clients' data before aggregation. The proposed approach is compared with conventional methods and achieves the highest accuracy and the lowest loss with respect to the aggregated communication cost.
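The abstract outlines the standard federated training loop together with a server-side aggregation rule that weights each client by the volume and variety of its local data. A minimal sketch of such weighted aggregation is given below; the specific weighting (sample count scaled by the entropy of each client's label distribution) and all function names are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of server-side weighted aggregation in federated learning.
# Weighting each client by data volume (sample count) and variety (label entropy)
# is an assumption for illustration; the paper's exact rule is not reproduced here.
import numpy as np

def label_entropy(label_counts):
    """Shannon entropy of a client's label distribution (a proxy for data 'variety')."""
    p = np.asarray(label_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def aggregate(global_model, client_updates):
    """Weighted average of client model parameters.

    client_updates: list of (model_params, num_samples, label_counts) tuples,
    where model_params is a list of numpy arrays with the same shapes as the
    layers in global_model.
    """
    weights = []
    for _, num_samples, label_counts in client_updates:
        # Weight grows with both the volume and the variety of the local data.
        weights.append(num_samples * (1.0 + label_entropy(label_counts)))
    weights = np.asarray(weights) / np.sum(weights)

    new_model = [np.zeros_like(layer) for layer in global_model]
    for w, (params, _, _) in zip(weights, client_updates):
        for i, layer in enumerate(params):
            new_model[i] += w * layer
    return new_model

# Minimal usage example with two clients and a one-layer "model".
if __name__ == "__main__":
    global_model = [np.zeros(3)]
    clients = [
        ([np.array([1.0, 1.0, 1.0])], 100, [90, 10]),  # many samples, skewed labels
        ([np.array([3.0, 3.0, 3.0])], 20, [10, 10]),   # few samples, balanced labels
    ]
    print(aggregate(global_model, clients))
```

In this sketch a client with many but highly skewed samples does not automatically dominate a smaller client with more diverse data, which is one plausible way to read "volume and variety" influencing the aggregation.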