Multi-granularity Weighted Federated Learning in Heterogeneous Mobile Edge Computing Systems

Shangxuan Cai, Yunfeng Zhao, Zhicheng Liu, Chao Qiu, Xiaofei Wang, Qinghua Hu
{"title":"Multi-granularity Weighted Federated Learning in Heterogeneous Mobile Edge Computing Systems","authors":"Shangxuan Cai, Yunfeng Zhao, Zhicheng Liu, Chao Qiu, Xiaofei Wang, Qinghua Hu","doi":"10.1109/ICDCS54860.2022.00049","DOIUrl":null,"url":null,"abstract":"As a promising framework for distributed learning in mobile edge computing scenarios, federated learning (FL) allows multiple mobile devices to train a model collaboratively without transferring raw data and exposing user privacy. However, vanilla FL schemes are still facing to problems in edge computing, where the diversity of tasks and devices causes the non-IID and multi-granularity data with model heterogeneity. It becomes a pressing challenge to jointly training edge devices accompanied by these problems, while vanilla FL only discusses them separately. To this end, we consider tailoring FL to adapt to mobile edge environments, which focus on solving the problems of collaborative training of edge devices with multi-granularity heterogeneous models under different data distributions. In particular, we proposed a distance-based FL for the same type of edge devices that provides personalized models to avoid the negative impact of non-IID data on model aggregation. Further, we design a bi-directional guidance method with a prior attention mechanism, which can transfer knowledge among edge devices with multi-granulairty and multi-scale models. The experimental results show that our proposed mechanisms significantly improve training performance compared to other baselines on IID and non-IID data. Furthermore, the bi-directional guidance significantly improves convergence efficiency and accuracy performance for finer and coarser granularity edge devices, respectively.","PeriodicalId":225883,"journal":{"name":"2022 IEEE 42nd International Conference on Distributed Computing Systems (ICDCS)","volume":"76 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 42nd International Conference on Distributed Computing Systems (ICDCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDCS54860.2022.00049","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

As a promising framework for distributed learning in mobile edge computing scenarios, federated learning (FL) allows multiple mobile devices to train a model collaboratively without transferring raw data or exposing user privacy. However, vanilla FL schemes still face problems in edge computing, where the diversity of tasks and devices leads to non-IID, multi-granularity data together with model heterogeneity. Jointly training edge devices under these conditions is a pressing challenge, yet vanilla FL addresses these problems only separately. To this end, we tailor FL to mobile edge environments, focusing on the collaborative training of edge devices that hold multi-granularity heterogeneous models under different data distributions. In particular, we propose a distance-based FL scheme for edge devices of the same type that provides personalized models, avoiding the negative impact of non-IID data on model aggregation. Further, we design a bi-directional guidance method with a prior attention mechanism, which transfers knowledge among edge devices with multi-granularity and multi-scale models. Experimental results show that our proposed mechanisms significantly improve training performance compared to other baselines on both IID and non-IID data. Furthermore, the bi-directional guidance significantly improves convergence efficiency for finer-granularity edge devices and accuracy for coarser-granularity ones.
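The abstract describes distance-based personalized aggregation only at a high level. Below is a minimal, hypothetical sketch of one way such aggregation could look for same-type devices, assuming "distance" means the L2 distance between flattened model parameters and that per-client weights come from a softmax over negative distances; the function name, temperature parameter, and weighting formula are illustrative assumptions, not the paper's published algorithm.

```python
# Hypothetical sketch of distance-based weighted aggregation for personalized FL.
# Assumption: similarity between clients is measured by L2 distance between their
# flattened parameter vectors, and closer clients receive larger aggregation weights.
import numpy as np

def personalized_aggregate(client_params, temperature=1.0):
    """For each client, build a personalized model as a distance-weighted
    average of all clients' parameter vectors.

    client_params: list of 1-D numpy arrays (flattened model weights),
                   all of the same shape (same model type/granularity).
    Returns a list of personalized parameter vectors, one per client.
    """
    stacked = np.stack(client_params)            # shape (K, D)
    personalized = []
    for i in range(len(client_params)):
        # Pairwise L2 distances from client i to every client (including itself).
        dists = np.linalg.norm(stacked - stacked[i], axis=1)
        # Softmax over negative distances: similar clients get larger weights,
        # which limits the influence of clients with very different (non-IID) data.
        logits = -dists / temperature
        weights = np.exp(logits - logits.max())
        weights /= weights.sum()
        personalized.append(weights @ stacked)   # shape (D,)
    return personalized

# Toy usage: three clients, two with similar updates and one outlier.
rng = np.random.default_rng(0)
clients = [rng.normal(loc=mu, scale=0.1, size=8) for mu in (0.0, 0.05, 1.0)]
for i, params in enumerate(personalized_aggregate(clients)):
    print(f"client {i}: personalized params ~ {params.round(2)}")
```

With this weighting, the outlier client contributes little to the other clients' personalized models, which is the intuition behind using personalization to soften non-IID effects during aggregation.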