{"title":"Lay Importance on the Layer: Federated Learning for Non-IID data with Layer-based Regularization","authors":"Eleftherios Charteros, I. Koutsopoulos","doi":"10.1109/INFOCOMWKSHPS57453.2023.10226146","DOIUrl":null,"url":null,"abstract":"We propose Federated Learning with Layer-Adaptive Proximation (FedLap), a new class of algorithms to tackle non-IID data. The novelty is a new regularization term in the local (client) loss function which generalizes that of FedProx and captures divergence between global and local model weights of each client, at the level of Deep Neural Network (DNN) layers. Thus, the weights of different layers of the DNN are treated differently in the regularization function. Divergence between the global and local models is captured through a dissimilarity metric and a distance metric, both applied to each DNN layer. Since regularization is applied per layer and not to all weights as in FedProx, during local updates, only the weights of those layers that drift away from the global model change, while the other weights are not affected. Compared to FedAvg and FedProx, FedLap achieves 3–5 % higher accuracy in the first 20 communication rounds, and up to 10% higher accuracy in cases of unstable client participation. Thus, FedLap paves the way for a novel layer-aware class of algorithms in distributed learning.","PeriodicalId":354290,"journal":{"name":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","volume":"90 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INFOCOMWKSHPS57453.2023.10226146","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We propose Federated Learning with Layer-Adaptive Proximation (FedLap), a new class of algorithms to tackle non-IID data. The novelty is a new regularization term in the local (client) loss function which generalizes that of FedProx and captures divergence between the global and local model weights of each client at the level of Deep Neural Network (DNN) layers. Thus, the weights of different layers of the DNN are treated differently in the regularization function. Divergence between the global and local models is captured through a dissimilarity metric and a distance metric, both applied per DNN layer. Since regularization is applied per layer rather than uniformly to all weights as in FedProx, during local updates only the weights of layers that drift away from the global model are adjusted, while the remaining weights are unaffected. Compared to FedAvg and FedProx, FedLap achieves 3–5% higher accuracy in the first 20 communication rounds, and up to 10% higher accuracy in cases of unstable client participation. Thus, FedLap paves the way for a novel layer-aware class of algorithms in distributed learning.
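To make the layer-adaptive proximal idea concrete, below is a minimal PyTorch sketch of a per-layer regularizer in the spirit described above. The abstract does not specify which dissimilarity and distance metrics are used, so the choices here (cosine dissimilarity and squared L2 distance per layer), the coefficient `mu`, and the function name `fedlap_regularizer` are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F


def fedlap_regularizer(local_model, global_model, mu=0.01):
    """Per-layer proximal regularizer (sketch of the layer-adaptive idea).

    For each layer, a divergence coefficient is computed from a
    dissimilarity metric (assumed here: 1 - cosine similarity) and scales
    a distance metric (assumed here: squared L2 distance), so that only
    layers that drift away from the global model contribute a noticeable
    proximal pull, unlike FedProx, which penalizes all weights uniformly.
    """
    device = next(local_model.parameters()).device
    reg = torch.tensor(0.0, device=device)
    global_params = dict(global_model.named_parameters())

    for name, w_local in local_model.named_parameters():
        w_global = global_params[name].detach()

        # Per-layer dissimilarity metric (assumption: cosine dissimilarity).
        cos = F.cosine_similarity(w_local.flatten(), w_global.flatten(), dim=0)
        dissim = (1.0 - cos).detach()  # treated as a fixed per-layer weight

        # Per-layer distance metric (assumption: squared L2 distance).
        dist = torch.sum((w_local - w_global) ** 2)

        # Layers aligned with the global model (dissim ~ 0) are barely penalized.
        reg = reg + dissim * dist

    return (mu / 2.0) * reg
```

In a local training loop, the client would add this term to its task loss, e.g. `loss = criterion(output, target) + fedlap_regularizer(model, global_model)`, before calling `backward()`; detaching the dissimilarity coefficient is one possible design choice that keeps gradients flowing only through the distance term.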