{"title":"非iid数据的联邦学习与服务器学习","authors":"V. Mai, R. La, Tao Zhang, Yuxuan Huang, A. Battou","doi":"10.1109/CISS56502.2023.10089643","DOIUrl":null,"url":null,"abstract":"Federated Learning (FL) has gained popularity as a means of distributed learning using local data samples at clients. However, recent studies showed that FL may experience slow learning and poor performance when client samples have different distributions. In this paper, we consider a server with access to a small dataset, on which it can perform its own learning. This approach is complementary to and can be combined with other approaches, e.g., sample sharing among clients. We study and demonstrate the benefits of proposed approach via experimental results obtained using two datasets - EMNIST and CIFAR10.","PeriodicalId":243775,"journal":{"name":"2023 57th Annual Conference on Information Sciences and Systems (CISS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Federated Learning With Server Learning for Non-IID Data\",\"authors\":\"V. Mai, R. La, Tao Zhang, Yuxuan Huang, A. Battou\",\"doi\":\"10.1109/CISS56502.2023.10089643\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated Learning (FL) has gained popularity as a means of distributed learning using local data samples at clients. However, recent studies showed that FL may experience slow learning and poor performance when client samples have different distributions. In this paper, we consider a server with access to a small dataset, on which it can perform its own learning. This approach is complementary to and can be combined with other approaches, e.g., sample sharing among clients. We study and demonstrate the benefits of proposed approach via experimental results obtained using two datasets - EMNIST and CIFAR10.\",\"PeriodicalId\":243775,\"journal\":{\"name\":\"2023 57th Annual Conference on Information Sciences and Systems (CISS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 57th Annual Conference on Information Sciences and Systems (CISS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CISS56502.2023.10089643\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 57th Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS56502.2023.10089643","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Federated Learning With Server Learning for Non-IID Data
Federated Learning (FL) has gained popularity as a means of distributed learning using local data samples at clients. However, recent studies have shown that FL may suffer from slow learning and poor performance when client samples have different distributions. In this paper, we consider a server with access to a small dataset, on which it can perform its own learning. This approach is complementary to, and can be combined with, other approaches such as sample sharing among clients. We study and demonstrate the benefits of the proposed approach via experimental results obtained using two datasets, EMNIST and CIFAR10.
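To make the idea concrete, the following is a minimal sketch of federated averaging augmented with a server-side learning step on a small server-held dataset. It illustrates the general idea described in the abstract only; it is not the authors' exact algorithm, and the toy model, synthetic non-IID data, and hyperparameters are all assumptions introduced for illustration.

```python
# Sketch: FedAvg plus an extra server-side training step on a small
# server-held dataset (hypothetical setup; not the paper's exact method).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

def make_client_data(num_clients=4, samples=200, dim=20, classes=4):
    """Toy non-IID setup: each client's labels are concentrated on one class."""
    data = []
    for c in range(num_clients):
        x = torch.randn(samples, dim)
        y = torch.full((samples,), c % classes)  # label-skewed client data
        data.append((x, y))
    return data

def local_update(global_model, x, y, epochs=1, lr=0.1):
    """One client's local training, starting from the current global model."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return model.state_dict()

def fed_avg(states):
    """Uniform averaging of client model parameters (FedAvg)."""
    avg = copy.deepcopy(states[0])
    for k in avg:
        avg[k] = torch.stack([s[k].float() for s in states]).mean(dim=0)
    return avg

def server_learning(model, x_s, y_s, epochs=1, lr=0.05):
    """Additional learning at the server on its own small dataset."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(x_s), y_s)
        loss.backward()
        opt.step()

dim, classes = 20, 4
global_model = nn.Linear(dim, classes)
clients = make_client_data(dim=dim, classes=classes)
# Small, roughly balanced dataset available at the server (assumed).
x_server = torch.randn(40, dim)
y_server = torch.randint(0, classes, (40,))

for rnd in range(10):
    states = [local_update(global_model, x, y) for x, y in clients]
    global_model.load_state_dict(fed_avg(states))
    server_learning(global_model, x_server, y_server)  # server-side step
    loss = F.cross_entropy(global_model(x_server), y_server).item()
    print(f"round {rnd}: server loss = {loss:.3f}")
```

In this sketch, the server-side step simply fine-tunes the aggregated model on the server's small dataset after each round; the paper's actual method, schedule, and hyperparameters may differ.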