Private and communication-efficient edge learning: a sparse differential Gaussian-masking distributed SGD approach

Xin Zhang, Minghong Fang, Jia Liu, Zhengyuan Zhu
{"title":"Private and communication-efficient edge learning: a sparse differential gaussian-masking distributed SGD approach","authors":"Xin Zhang, Minghong Fang, Jia Liu, Zhengyuan Zhu","doi":"10.1145/3397166.3409123","DOIUrl":null,"url":null,"abstract":"With the rise of machine learning (ML) and the proliferation of smart mobile devices, recent years have witnessed a surge of interest in performing ML in wireless edge networks. In this paper, we consider the problem of jointly improving data privacy and communication efficiency of distributed edge learning, both of which are critical performance metrics in wireless edge network computing. Toward this end, we propose a new distributed stochastic gradient method with sparse differential Gaussian-masked stochastic gradients (SDM-DSGD) for non-convex distributed edge learning. Our main contributions are three-fold: i) We theoretically establish the privacy and communication efficiency performance guarantee for our SDM-DSGD method, which outperforms all existing works; ii) We propose a generalized differential-coded DSGD update, which enables a much lower transmit probability for gradient sparsification, and provides an [EQUATION] convergence rate; and iii) We reveal theoretical insights and offer practical design guidelines for the interactions between privacy preservation and communication efficiency - two conflicting performance goals. We conduct extensive experiments with a variety of learning models on MNIST and CIFAR-10 datasets to verify our theoretical findings.","PeriodicalId":122577,"journal":{"name":"Proceedings of the Twenty-First International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Twenty-First International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3397166.3409123","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 17

Abstract

With the rise of machine learning (ML) and the proliferation of smart mobile devices, recent years have witnessed a surge of interest in performing ML in wireless edge networks. In this paper, we consider the problem of jointly improving data privacy and communication efficiency of distributed edge learning, both of which are critical performance metrics in wireless edge network computing. Toward this end, we propose a new distributed stochastic gradient method with sparse differential Gaussian-masked stochastic gradients (SDM-DSGD) for non-convex distributed edge learning. Our main contributions are three-fold: i) We theoretically establish the privacy and communication efficiency performance guarantee for our SDM-DSGD method, which outperforms all existing works; ii) We propose a generalized differential-coded DSGD update, which enables a much lower transmit probability for gradient sparsification, and provides an [EQUATION] convergence rate; and iii) We reveal theoretical insights and offer practical design guidelines for the interactions between privacy preservation and communication efficiency - two conflicting performance goals. We conduct extensive experiments with a variety of learning models on MNIST and CIFAR-10 datasets to verify our theoretical findings.
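The abstract names three ingredients of the worker-side update: Gaussian masking of the stochastic gradient for privacy, differential coding against the last transmitted vector, and probabilistic sparsification to cut communication. The sketch below is a minimal illustration of how such an update could be composed, not the authors' reference implementation; the function name, the noise scale `sigma`, and the transmit probability `p` are assumptions introduced here for illustration only.

```python
# Illustrative sketch only -- assumed composition of the three mechanisms
# named in the abstract, not the paper's actual algorithm or parameters.
import numpy as np

def sdm_dsgd_worker_step(grad, last_sent, sigma=0.1, p=0.2, rng=None):
    """One hypothetical worker-side update: mask the stochastic gradient
    with Gaussian noise, encode it differentially against the last
    transmitted vector, and sparsify by transmitting each coordinate
    with probability p."""
    rng = np.random.default_rng() if rng is None else rng

    # Gaussian masking: additive noise for differential privacy.
    masked = grad + rng.normal(0.0, sigma, size=grad.shape)

    # Differential coding: send only the change since the last transmission.
    diff = masked - last_sent

    # Probabilistic sparsification: keep each coordinate with probability p,
    # rescaled by 1/p so the transmitted message stays unbiased.
    keep = rng.random(grad.shape) < p
    sparse_msg = np.where(keep, diff / p, 0.0)

    # The new reference is what the receiver can reconstruct from the message.
    new_last_sent = last_sent + sparse_msg
    return sparse_msg, new_last_sent
```

In such a setup, the server would add each received sparse message to its running copy of that worker's reference vector and aggregate across workers to form the model update; the paper's analysis concerns how the masking noise and the transmit probability trade off privacy, communication cost, and the convergence rate.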