P2CEFL: Privacy-Preserving and Communication Efficient Federated Learning With Sparse Gradient and Dithering Quantization

IF 7.7 | CAS Tier 2 (Computer Science) | JCR Q1, COMPUTER SCIENCE, INFORMATION SYSTEMS
Gang Wang;Qi Qi;Rui Han;Lin Bai;Jinho Choi
{"title":"P2CEFL:利用稀疏梯度和抖动量化进行隐私保护和通信高效的联合学习","authors":"Gang Wang;Qi Qi;Rui Han;Lin Bai;Jinho Choi","doi":"10.1109/TMC.2024.3445957","DOIUrl":null,"url":null,"abstract":"Federated learning (FL) offers a promising framework for obtaining a global model by aggregating trained parameters from participating clients without transmitting their local private data. To further enhance privacy, differential privacy (DP)-based FL can be considered, wherein certain amounts of noise are added to the transmitting parameters, inevitably leading to a deterioration in communication efficiency. In this paper, we propose a novel Privacy-Preserving and Communication Efficient Federated Learning (P2CEFL) algorithm to reduce communication overhead under DP guarantee, utilizing sparse gradient and dithering quantization. Through gradient sparsification, the upload overhead for clients decreases considerably. Additionally, a subtractive dithering approach is employed to quantize sparse gradient, further reducing the bits for communication. We conduct theoretical analysis on privacy protection and convergence to verify the effectiveness of the proposed algorithm. Extensive numerical simulations show that the P2CEFL algorithm can achieve a similar level of model accuracy and significantly reduce communication costs compared to existing conventional DP-based FL methods.","PeriodicalId":50389,"journal":{"name":"IEEE Transactions on Mobile Computing","volume":"23 12","pages":"14722-14736"},"PeriodicalIF":7.7000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"P2CEFL: Privacy-Preserving and Communication Efficient Federated Learning With Sparse Gradient and Dithering Quantization\",\"authors\":\"Gang Wang;Qi Qi;Rui Han;Lin Bai;Jinho Choi\",\"doi\":\"10.1109/TMC.2024.3445957\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning (FL) offers a promising framework for obtaining a global model by aggregating trained parameters from participating clients without transmitting their local private data. To further enhance privacy, differential privacy (DP)-based FL can be considered, wherein certain amounts of noise are added to the transmitting parameters, inevitably leading to a deterioration in communication efficiency. In this paper, we propose a novel Privacy-Preserving and Communication Efficient Federated Learning (P2CEFL) algorithm to reduce communication overhead under DP guarantee, utilizing sparse gradient and dithering quantization. Through gradient sparsification, the upload overhead for clients decreases considerably. Additionally, a subtractive dithering approach is employed to quantize sparse gradient, further reducing the bits for communication. We conduct theoretical analysis on privacy protection and convergence to verify the effectiveness of the proposed algorithm. 
Extensive numerical simulations show that the P2CEFL algorithm can achieve a similar level of model accuracy and significantly reduce communication costs compared to existing conventional DP-based FL methods.\",\"PeriodicalId\":50389,\"journal\":{\"name\":\"IEEE Transactions on Mobile Computing\",\"volume\":\"23 12\",\"pages\":\"14722-14736\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Mobile Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10640286/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Mobile Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10640286/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Federated learning (FL) offers a promising framework for obtaining a global model by aggregating trained parameters from participating clients without transmitting their local private data. To further enhance privacy, differential privacy (DP)-based FL can be considered, wherein certain amounts of noise are added to the transmitting parameters, inevitably leading to a deterioration in communication efficiency. In this paper, we propose a novel Privacy-Preserving and Communication Efficient Federated Learning (P2CEFL) algorithm to reduce communication overhead under DP guarantee, utilizing sparse gradient and dithering quantization. Through gradient sparsification, the upload overhead for clients decreases considerably. Additionally, a subtractive dithering approach is employed to quantize sparse gradient, further reducing the bits for communication. We conduct theoretical analysis on privacy protection and convergence to verify the effectiveness of the proposed algorithm. Extensive numerical simulations show that the P2CEFL algorithm can achieve a similar level of model accuracy and significantly reduce communication costs compared to existing conventional DP-based FL methods.
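For context, the following is a minimal NumPy sketch of the two compression steps the abstract describes: top-k gradient sparsification followed by subtractive dithered quantization, with clipping and Gaussian noise standing in for the DP mechanism. This is an illustrative sketch of the general techniques only, not the paper's P2CEFL implementation; all function names and parameters (k, num_levels, clip_norm, noise_std, dither_seed) are assumptions chosen for the example.

```python
import numpy as np

def sparsify_top_k(grad, k):
    """Keep the k largest-magnitude entries of the gradient, drop the rest."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def dithered_quantize(values, num_levels, dither_rng):
    """Subtractive dithering: add uniform dither before rounding to a uniform grid.
    The server regenerates the same dither from a shared seed and subtracts it."""
    lo, hi = float(values.min()), float(values.max())
    step = max((hi - lo) / (num_levels - 1), 1e-12)
    dither = dither_rng.uniform(-step / 2, step / 2, size=values.shape)
    codes = np.round((values + dither - lo) / step).astype(np.int32)
    return codes, lo, step

def dequantize(codes, lo, step, dither):
    """Reconstruct values from codewords and subtract the shared dither."""
    return codes * step + lo - dither

def client_upload(grad, k=100, num_levels=16, clip_norm=1.0,
                  noise_std=0.1, dither_seed=0):
    """One client's upload: clip, add Gaussian noise (DP stand-in),
    keep the top-k entries, then quantize them with subtractive dithering."""
    grad = grad * min(1.0, clip_norm / (np.linalg.norm(grad) + 1e-12))
    grad = grad + np.random.default_rng().normal(0.0, noise_std, size=grad.shape)
    idx, values = sparsify_top_k(grad, k)
    dither_rng = np.random.default_rng(dither_seed)  # seed shared with the server
    codes, lo, step = dithered_quantize(values, num_levels, dither_rng)
    return idx, codes, lo, step  # indices, small integer codes, and two floats

def server_decode(dim, idx, codes, lo, step, dither_seed=0):
    """Server side: regenerate the dither from the shared seed and rebuild the sparse gradient."""
    dither = np.random.default_rng(dither_seed).uniform(-step / 2, step / 2,
                                                        size=codes.shape)
    recovered = np.zeros(dim)
    recovered[idx] = dequantize(codes, lo, step, dither)
    return recovered

if __name__ == "__main__":
    g = np.random.randn(10_000)
    idx, codes, lo, step = client_upload(g)
    g_hat = server_decode(g.size, idx, codes, lo, step)
    print("nonzero entries sent:", idx.size, "quantization step:", round(step, 4))
```

The point of the subtractive-dithering step is that the dither sequence is reproducible on the server from a shared seed, so it can be subtracted after decoding, leaving only uniform quantization error on the few transmitted entries.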
Source Journal
IEEE Transactions on Mobile Computing (Engineering & Technology - Telecommunications)
CiteScore: 12.90
Self-citation rate: 2.50%
Articles per year: 403
Average review time: 6.6 months
Journal description: IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.