{"title":"通过共享边缘数据加速无线联合学习的收敛","authors":"Eunil Seo, Vinh Pham, E. Elmroth","doi":"10.1109/ICOIN56518.2023.10048937","DOIUrl":null,"url":null,"abstract":"Deploying federated learning (FL) over wireless mobile networks can be expensive because of the cost of wireless communication resources. Efforts have been made to reduce communication costs by accelerating model convergence, leading to the development of model-driven methods based on feature extraction, model-integrated algorithms, and client selection. However, the resulting performance gains are limited by the dependence of neural network convergence on input data quality. This work, therefore, investigates the use of marginal shared data (e.g., a single data entry) to accelerate model convergence and thereby reduce communication costs in FL. Experimental results show that sharing even a single piece of data can improve performance by 14.6% and reduce communication costs by 61.13% when using the federated averaging algorithm (FedAvg). Marginal data sharing could therefore be an attractive and practical solution in privacy-flexible environments or collaborative operational systems such as fog robotics and vehicles. Moreover, by assigning new labels to the shared data, it is possible to extend the number of classifying labels of an FL model even when the initial input datasets lack the labels in question.","PeriodicalId":285763,"journal":{"name":"2023 International Conference on Information Networking (ICOIN)","volume":"890 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Accelerating convergence in wireless federated learning by sharing marginal data\",\"authors\":\"Eunil Seo, Vinh Pham, E. Elmroth\",\"doi\":\"10.1109/ICOIN56518.2023.10048937\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deploying federated learning (FL) over wireless mobile networks can be expensive because of the cost of wireless communication resources. Efforts have been made to reduce communication costs by accelerating model convergence, leading to the development of model-driven methods based on feature extraction, model-integrated algorithms, and client selection. However, the resulting performance gains are limited by the dependence of neural network convergence on input data quality. This work, therefore, investigates the use of marginal shared data (e.g., a single data entry) to accelerate model convergence and thereby reduce communication costs in FL. Experimental results show that sharing even a single piece of data can improve performance by 14.6% and reduce communication costs by 61.13% when using the federated averaging algorithm (FedAvg). Marginal data sharing could therefore be an attractive and practical solution in privacy-flexible environments or collaborative operational systems such as fog robotics and vehicles. 
Moreover, by assigning new labels to the shared data, it is possible to extend the number of classifying labels of an FL model even when the initial input datasets lack the labels in question.\",\"PeriodicalId\":285763,\"journal\":{\"name\":\"2023 International Conference on Information Networking (ICOIN)\",\"volume\":\"890 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Conference on Information Networking (ICOIN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICOIN56518.2023.10048937\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Information Networking (ICOIN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOIN56518.2023.10048937","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Deploying federated learning (FL) over wireless mobile networks can be expensive because of the cost of wireless communication resources. Efforts have been made to reduce communication costs by accelerating model convergence, leading to model-driven methods based on feature extraction, model-integrated algorithms, and client selection. However, the resulting performance gains are limited because neural network convergence depends on the quality of the input data. This work therefore investigates the use of marginal shared data (e.g., a single data entry) to accelerate model convergence and thereby reduce communication costs in FL. Experimental results show that sharing even a single piece of data can improve performance by 14.6% and reduce communication costs by 61.13% when using the federated averaging algorithm (FedAvg). Marginal data sharing could therefore be an attractive and practical solution in privacy-flexible environments or in collaborative operational systems such as fog robotics and vehicles. Moreover, by assigning new labels to the shared data, the set of classification labels of an FL model can be extended even when the initial input datasets lack the labels in question.
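To make the mechanism concrete, below is a minimal, self-contained sketch of FedAvg in which one globally shared data point is appended to every client's local dataset before local training. The synthetic data, the logistic-regression model, the client split, and all hyperparameters are illustrative assumptions for this sketch, not the authors' experimental setup.

```python
# Sketch: FedAvg with one globally shared ("marginal") data point appended to
# every client's local dataset. Illustrative only; model and data are assumed.
import numpy as np

rng = np.random.default_rng(0)

def make_client_data(n_clients=5, n_per_client=40, dim=10):
    """Synthetic, mildly non-IID binary classification data per client."""
    clients = []
    true_w = rng.normal(size=dim)
    for _ in range(n_clients):
        # each client draws features around its own offset (non-IID flavor)
        X = rng.normal(loc=rng.normal(scale=0.5), size=(n_per_client, dim))
        y = (X @ true_w + rng.normal(scale=0.1, size=n_per_client) > 0).astype(float)
        clients.append((X, y))
    return clients, true_w

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Plain logistic-regression gradient descent on one client's data."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        grad = X.T @ (p - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(clients, dim, rounds=20, shared=None):
    """FedAvg loop; if `shared` is given, append that single (x, y) pair to every client."""
    w = np.zeros(dim)
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in clients:
            if shared is not None:
                X = np.vstack([X, shared[0][None, :]])
                y = np.append(y, shared[1])
            updates.append(local_sgd(w.copy(), X, y))
            sizes.append(len(y))
        # FedAvg aggregation: size-weighted average of the client models
        w = np.average(updates, axis=0, weights=sizes)
    return w

clients, true_w = make_client_data()
shared_point = (rng.normal(size=10), 1.0)  # the single shared data entry
w_plain = fedavg(clients, dim=10)
w_shared = fedavg(clients, dim=10, shared=shared_point)
print("cosine vs. true weights, plain :",
      true_w @ w_plain / (np.linalg.norm(true_w) * np.linalg.norm(w_plain)))
print("cosine vs. true weights, shared:",
      true_w @ w_shared / (np.linalg.norm(true_w) * np.linalg.norm(w_shared)))
```

The `shared` argument is the only difference between the two runs; in the paper's FedAvg experiments, this kind of marginal data sharing is what produced the reported accuracy improvement and communication-cost reduction.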