Scaling Data Analysis Services in an Edge-based Federated Learning Environment

Alessio Catalfamo, Lorenzo Carnevale, A. Galletta, Francesco Martella, A. Celesti, M. Fazio, M. Villari
{"title":"Scaling Data Analysis Services in an Edge-based Federated Learning Environment","authors":"Alessio Catalfamo, Lorenzo Carnevale, A. Galletta, Francesco Martella, A. Celesti, M. Fazio, M. Villari","doi":"10.1109/UCC56403.2022.00030","DOIUrl":null,"url":null,"abstract":"Federated Learning represents among the most important techniques used in recent years. It enables the training of Machine Learning-related models without sharing sensitive data. Federated Learning mainly exploits the Edge Computing paradigm for training data acquired from the surrounding environment. The solution proposed in this paper seeks to optimize all the processes involved within a Federated Learning client through transparent scaling across different devices. The proposed architecture and implementation abstracts the Federated Learning client architecture to create a transparent cluster that can optimize the complicated computation and aggregate the data to solve the heterogeneous distribution issue of the data in Federated Learning applications.","PeriodicalId":203244,"journal":{"name":"2022 IEEE/ACM 15th International Conference on Utility and Cloud Computing (UCC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE/ACM 15th International Conference on Utility and Cloud Computing (UCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UCC56403.2022.00030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Federated Learning has become one of the most important Machine Learning techniques of recent years. It enables the training of Machine Learning models without sharing sensitive data, and it mainly exploits the Edge Computing paradigm to train on data acquired from the surrounding environment. The solution proposed in this paper seeks to optimize all the processes involved in a Federated Learning client through transparent scaling across different devices. The proposed architecture and implementation abstract the Federated Learning client to create a transparent cluster that can handle the heavy computation and aggregate the data, addressing the heterogeneous distribution of data in Federated Learning applications.
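The paper itself does not include source code; purely as an illustration of the kind of client-side training and aggregation it describes, below is a minimal FedAvg-style sketch in Python. It assumes each edge device trains a small linear model on its own local data and a coordinator aggregates the resulting weights, weighted by sample counts. All names and data here are hypothetical and are not taken from the paper's implementation.

# Hypothetical FedAvg-style sketch (not the authors' implementation):
# each simulated edge device trains a tiny linear model locally, and a
# coordinator aggregates the weights without ever seeing the raw data.
import numpy as np

def local_train(X, y, epochs=200, lr=0.1):
    """Gradient-descent training of a linear model on one device's local data."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w, len(y)

def federated_round(device_datasets):
    """One round: collect local weights and average them, weighted by sample count."""
    results = [local_train(X, y) for X, y in device_datasets]
    weights = np.array([w for w, _ in results])
    counts = np.array([n for _, n in results], dtype=float)
    return np.average(weights, axis=0, weights=counts)  # FedAvg aggregation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three simulated edge devices with differently sized local datasets.
    devices = []
    for n in (30, 80, 50):
        X = rng.normal(size=(n, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=n)
        devices.append((X, y))
    print("aggregated weights:", federated_round(devices))

A real deployment would replace the simulated datasets with data collected at each edge device and repeat such rounds, redistributing the aggregated model to the clients between rounds.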