{"title":"具有局部动量的沟通高效联邦学习框架","authors":"Renyou Xie, Xiaojun Zhou","doi":"10.1109/HSI55341.2022.9869493","DOIUrl":null,"url":null,"abstract":"With the recent progress of AI, large amount of data generated in distributed Internet of Things (IoT) devices can be used to build different kinds of models that are helpful to improve people’s daily life. For example, language models can improve the speech recognition performance. Federated learning enables the distributed clients to jointly learn a model with data preserve in local, which provide a promising solution to leverage the massive data. However, in federated learning, the model learned by the local devices need to be repeatedly transmit to the server, which poses communication overhead. To tackle the communication issue, this paper proposes a communication efficient federated learning framework that utilize local momentum term to accelerate the convergence speed. Convergence guarantee under non-convex case is provided. Experiment on EMNIST and CIFAR10 dataset demonstrate that proposed method can effectively increase the convergence speed.","PeriodicalId":282607,"journal":{"name":"2022 15th International Conference on Human System Interaction (HSI)","volume":"47 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Communication Efficient Federated Learning Framework with Local Momentum\",\"authors\":\"Renyou Xie, Xiaojun Zhou\",\"doi\":\"10.1109/HSI55341.2022.9869493\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the recent progress of AI, large amount of data generated in distributed Internet of Things (IoT) devices can be used to build different kinds of models that are helpful to improve people’s daily life. For example, language models can improve the speech recognition performance. Federated learning enables the distributed clients to jointly learn a model with data preserve in local, which provide a promising solution to leverage the massive data. However, in federated learning, the model learned by the local devices need to be repeatedly transmit to the server, which poses communication overhead. To tackle the communication issue, this paper proposes a communication efficient federated learning framework that utilize local momentum term to accelerate the convergence speed. Convergence guarantee under non-convex case is provided. 
Experiment on EMNIST and CIFAR10 dataset demonstrate that proposed method can effectively increase the convergence speed.\",\"PeriodicalId\":282607,\"journal\":{\"name\":\"2022 15th International Conference on Human System Interaction (HSI)\",\"volume\":\"47 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 15th International Conference on Human System Interaction (HSI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HSI55341.2022.9869493\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 15th International Conference on Human System Interaction (HSI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HSI55341.2022.9869493","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Communication Efficient Federated Learning Framework with Local Momentum
With the recent progress of AI, the large amounts of data generated by distributed Internet of Things (IoT) devices can be used to build many kinds of models that help improve people's daily lives. For example, language models can improve speech recognition performance. Federated learning enables distributed clients to jointly learn a model while keeping their data local, which provides a promising way to leverage this massive data. However, in federated learning the models learned on local devices must be transmitted repeatedly to the server, which introduces communication overhead. To tackle this issue, this paper proposes a communication-efficient federated learning framework that uses a local momentum term to accelerate convergence. A convergence guarantee for the non-convex case is provided. Experiments on the EMNIST and CIFAR10 datasets demonstrate that the proposed method effectively increases the convergence speed.
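The abstract does not spell out the update rule, so the following is only a minimal sketch of the general idea under stated assumptions: each client runs several SGD steps with a momentum buffer that stays on the device (assumed heavy-ball form), and the server averages the returned models FedAvg-style, so extra state is never communicated. All names (`client_update`, `server_round`, `grad_fn`) and hyperparameters are illustrative, not the authors' implementation.

```python
import numpy as np

def client_update(w_global, grad_fn, lr=0.1, beta=0.9, local_steps=5):
    """Local training on one client; the momentum buffer never leaves the device."""
    w = w_global.copy()
    m = np.zeros_like(w)              # local momentum buffer (assumed heavy-ball form)
    for _ in range(local_steps):      # several local steps per communication round
        g = grad_fn(w)                # stochastic gradient on this client's data
        m = beta * m + g              # accumulate momentum locally
        w = w - lr * m
    return w                          # only the model is sent to the server

def server_round(w_global, client_grad_fns):
    """One communication round: broadcast, local training, plain averaging."""
    local_models = [client_update(w_global, g) for g in client_grad_fns]
    return np.mean(local_models, axis=0)

# Toy usage: two clients minimizing quadratics with different optima.
w = np.array([5.0, -3.0])
clients = [lambda w: 2 * (w - 1.0), lambda w: 2 * (w + 1.0)]
for _ in range(20):
    w = server_round(w, clients)
print(w)  # drifts toward the average optimum, roughly [0, 0]
```

Because each client takes `local_steps` updates per round instead of one, fewer communication rounds are needed for the same amount of optimization work, which is the communication saving the paper targets; keeping the momentum buffer local means the per-round payload is no larger than plain FedAvg.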