Efficient and Privacy-preserving Distributed Learning in Cloud-Edge Computing Systems

Yili Jiang, Kuan Zhang, Y. Qian, R. Hu
{"title":"Efficient and Privacy-preserving Distributed Learning in Cloud-Edge Computing Systems","authors":"Yili Jiang, Kuan Zhang, Y. Qian, R. Hu","doi":"10.1145/3468218.3469044","DOIUrl":null,"url":null,"abstract":"Machine learning and cloud computing have been integrated in diverse applications to provide intelligent services. With powerful computational ability, the cloud server can execute machine learning algorithm efficiently. However, since accurate machine learning highly depends on training the model with sufficient data. Transmitting massive raw data from distributed devices to the cloud leads to heavy communication overhead and privacy leakage. Distributed learning is a promising technique to reduce data transmission by allowing the distributed devices to participant in model training locally. Thus a global learning task can be performed in a distributed way. Although it avoids to disclose the participants' raw data to the cloud directly, the cloud can infer partial private information by analyzing their local models. To tackle this challenge, the state-of-the-art solutions mainly rely on encryption and differential privacy. In this paper, we propose to implement the distributed learning in a three-layer cloud-edge computing system. By applying the mini-batch gradient decent, we can decompose a learning task to distributed edge nodes and participants hierarchically. To improve the communication efficiency while preserving privacy, we employ secure aggregation protocol in small groups by utilizing the social network of participants. 
Simulation results are presented to show the effectiveness of our proposed scheme in terms of learning accuracy and efficiency.","PeriodicalId":318719,"journal":{"name":"Proceedings of the 3rd ACM Workshop on Wireless Security and Machine Learning","volume":"79 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 3rd ACM Workshop on Wireless Security and Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3468218.3469044","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Machine learning and cloud computing have been integrated in diverse applications to provide intelligent services. With powerful computational ability, the cloud server can execute machine learning algorithms efficiently. However, accurate machine learning highly depends on training the model with sufficient data, and transmitting massive raw data from distributed devices to the cloud leads to heavy communication overhead and privacy leakage. Distributed learning is a promising technique to reduce data transmission by allowing the distributed devices to participate in model training locally, so that a global learning task can be performed in a distributed way. Although this avoids disclosing the participants' raw data to the cloud directly, the cloud can still infer partial private information by analyzing their local models. To tackle this challenge, state-of-the-art solutions mainly rely on encryption and differential privacy. In this paper, we propose to implement distributed learning in a three-layer cloud-edge computing system. By applying mini-batch gradient descent, we can decompose a learning task hierarchically across distributed edge nodes and participants. To improve communication efficiency while preserving privacy, we employ a secure aggregation protocol in small groups by utilizing the social network of participants. Simulation results are presented to show the effectiveness of our proposed scheme in terms of learning accuracy and efficiency.
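The two ingredients named in the abstract, hierarchical mini-batch gradient descent and group-wise secure aggregation, can be illustrated together in a single round. The sketch below is a minimal illustration, not the paper's protocol: the group sizes, model dimension, loss function, and the use of pairwise additive masks (a standard secure-aggregation building block) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 2 edge nodes, each serving a small group of 3
# participants. Each participant holds a local mini-batch and computes
# a local gradient; shapes and sizes here are illustrative only.
DIM = 4  # model dimension

def local_gradient(X, y, w):
    """Mini-batch gradient of squared loss: X^T (Xw - y) / batch size."""
    return X.T @ (X @ w - y) / len(y)

def pairwise_masks(n, dim, rng):
    """One shared random vector per participant pair; participant i adds
    m_ij for j > i and subtracts it for j < i, so the masks cancel when
    the whole group's uploads are summed."""
    masks = [np.zeros(dim) for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.normal(size=dim)
            masks[i] += m
            masks[j] -= m
    return masks

# --- one round of hierarchical, privacy-preserving aggregation ---
w = np.zeros(DIM)
edge_sums, total_participants = [], 0

for edge in range(2):                     # each edge node handles one group
    grads = []
    for _ in range(3):                    # participants in this group
        X = rng.normal(size=(8, DIM))     # local mini-batch of 8 samples
        y = X @ np.ones(DIM) + 0.1 * rng.normal(size=8)
        grads.append(local_gradient(X, y, w))
    masks = pairwise_masks(3, DIM, rng)
    # Each participant uploads only its masked gradient, so the edge node
    # never observes any individual gradient in the clear.
    masked = [g + m for g, m in zip(grads, masks)]
    edge_sum = np.sum(masked, axis=0)     # masks cancel in the group sum
    assert np.allclose(edge_sum, np.sum(grads, axis=0))
    edge_sums.append(edge_sum)
    total_participants += 3

# The cloud aggregates edge-level sums and takes one descent step.
global_grad = np.sum(edge_sums, axis=0) / total_participants
w = w - 0.1 * global_grad
```

Restricting secure aggregation to small groups keeps the pairwise-mask cost at O(group size) per participant rather than O(total participants), which is the communication-efficiency benefit the abstract attributes to grouping.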