Lay Importance on the Layer: Federated Learning for Non-IID data with Layer-based Regularization

Eleftherios Charteros, I. Koutsopoulos
{"title":"Lay Importance on the Layer: Federated Learning for Non-IID data with Layer-based Regularization","authors":"Eleftherios Charteros, I. Koutsopoulos","doi":"10.1109/INFOCOMWKSHPS57453.2023.10226146","DOIUrl":null,"url":null,"abstract":"We propose Federated Learning with Layer-Adaptive Proximation (FedLap), a new class of algorithms to tackle non-IID data. The novelty is a new regularization term in the local (client) loss function which generalizes that of FedProx and captures divergence between global and local model weights of each client, at the level of Deep Neural Network (DNN) layers. Thus, the weights of different layers of the DNN are treated differently in the regularization function. Divergence between the global and local models is captured through a dissimilarity metric and a distance metric, both applied to each DNN layer. Since regularization is applied per layer and not to all weights as in FedProx, during local updates, only the weights of those layers that drift away from the global model change, while the other weights are not affected. Compared to FedAvg and FedProx, FedLap achieves 3–5 % higher accuracy in the first 20 communication rounds, and up to 10% higher accuracy in cases of unstable client participation. Thus, FedLap paves the way for a novel layer-aware class of algorithms in distributed learning.","PeriodicalId":354290,"journal":{"name":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","volume":"90 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INFOCOMWKSHPS57453.2023.10226146","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We propose Federated Learning with Layer-Adaptive Proximation (FedLap), a new class of algorithms to tackle non-IID data. The novelty is a new regularization term in the local (client) loss function which generalizes that of FedProx and captures divergence between global and local model weights of each client, at the level of Deep Neural Network (DNN) layers. Thus, the weights of different layers of the DNN are treated differently in the regularization function. Divergence between the global and local models is captured through a dissimilarity metric and a distance metric, both applied to each DNN layer. Since regularization is applied per layer and not to all weights as in FedProx, during local updates, only the weights of those layers that drift away from the global model change, while the other weights are not affected. Compared to FedAvg and FedProx, FedLap achieves 3–5% higher accuracy in the first 20 communication rounds, and up to 10% higher accuracy in cases of unstable client participation. Thus, FedLap paves the way for a novel layer-aware class of algorithms in distributed learning.
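To make the per-layer idea concrete, below is a minimal sketch of a layer-adaptive proximal term in PyTorch. The abstract does not give the paper's exact formulas, so the specific choices here are assumptions: cosine dissimilarity as the layer-level dissimilarity metric, squared L2 distance as the distance metric, and a single global coefficient mu. The function name fedlap_regularizer is illustrative, not from the paper.

```python
import torch
import torch.nn.functional as F

def fedlap_regularizer(local_model, global_model, mu=0.01):
    """Layer-adaptive proximal term (sketch, not the paper's exact form).

    For each layer, the squared L2 distance to the global weights is
    scaled by a per-layer dissimilarity score, so only layers that
    drift from the global model are meaningfully penalized.
    Cosine dissimilarity and squared L2 distance are assumptions here.
    """
    reg = 0.0
    for (name, w_local), (_, w_global) in zip(
        local_model.named_parameters(), global_model.named_parameters()
    ):
        w_l = w_local.flatten()
        w_g = w_global.detach().flatten()
        # Dissimilarity metric: 1 - cosine similarity, per layer.
        dissim = 1.0 - F.cosine_similarity(w_l, w_g, dim=0)
        # Distance metric: squared L2 distance between layer weights.
        dist = torch.sum((w_l - w_g) ** 2)
        # Layers close to the global model contribute ~0 to the penalty.
        reg = reg + dissim * dist
    return 0.5 * mu * reg

# Client-side local update (sketch):
#   loss = task_loss(local_model(x), y) \
#          + fedlap_regularizer(local_model, global_model)
#   loss.backward(); optimizer.step()
```

Because each layer's penalty is scaled by its own dissimilarity score, layers that already agree with the global model contribute almost nothing to the gradient, which matches the abstract's claim that during local updates only drifting layers change while the other weights are not affected.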