PEPFL: A framework for a practical and efficient privacy-preserving federated learning

IF 7.5 · CAS Region 2 (Computer Science) · Q1 (Telecommunications)
Yange Chen, Baocang Wang, Hang Jiang, Pu Duan, Yuan Ping, Zhiyong Hong
DOI: 10.1016/j.dcan.2022.05.019
Journal: Digital Communications and Networks (Q1, Telecommunications)
Published: 2024-04-01
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2352864822001122
Citations: 0

Abstract

As an emerging joint learning model, federated learning is a promising way to combine the model parameters of different users for training and inference without collecting the users' original data. However, previous work has not established a practical and efficient solution, owing to the absence of efficient matrix computation and cryptography schemes in privacy-preserving federated learning models, especially in partially homomorphic cryptosystems. In this paper, we propose a Practical and Efficient Privacy-preserving Federated Learning (PEPFL) framework. First, we present a lifted distributed ElGamal cryptosystem for federated learning, which solves the multi-key problem in federated learning. Second, we develop a Practical Partially Single Instruction Multiple Data (PSIMD) parallelism scheme that encodes a plaintext matrix into a single plaintext for encryption, improving encryption efficiency and reducing communication cost in partially homomorphic cryptosystems. In addition, based on a Convolutional Neural Network (CNN) and the designed cryptosystem, a novel privacy-preserving federated learning framework is designed using Momentum Gradient Descent (MGD). Finally, we evaluate the security and performance of PEPFL. The experimental results demonstrate that the scheme is practical, effective, and secure, with low communication and computation costs.
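To illustrate the "lifted" ElGamal idea the abstract relies on: placing the message in the exponent makes ElGamal additively homomorphic, so ciphertexts can be multiplied to add gradients without decryption. The following is a minimal single-key sketch with toy parameters (the modulus, generator, and plaintext bound are illustrative assumptions; the paper's distributed, multi-key variant is not reproduced here and these parameters are not secure):

```python
import random

p = 2**61 - 1  # toy Mersenne-prime modulus (illustrative, not secure)
g = 3          # assumed generator

sk = random.randrange(2, p - 1)   # secret key
h = pow(g, sk, p)                 # public key h = g^sk mod p

def enc(m):
    """Encrypt m 'in the exponent': (g^r, g^m * h^r) mod p."""
    r = random.randrange(2, p - 1)
    return (pow(g, r, p), pow(g, m, p) * pow(h, r, p) % p)

def add(c1, c2):
    """Component-wise product of ciphertexts adds the plaintexts."""
    return (c1[0] * c2[0] % p, c1[1] * c2[1] % p)

def dec(c, max_m=10**6):
    """Recover g^m, then solve the small discrete log by search."""
    gm = c[1] * pow(c[0], p - 1 - sk, p) % p  # c2 * c1^(-sk)
    acc = 1
    for m in range(max_m):
        if acc == gm:
            return m
        acc = acc * g % p
    raise ValueError("plaintext out of range")

c = add(enc(7), enc(35))
print(dec(c))  # 42
```

The trade-off this sketch makes visible is why lifted ElGamal suits federated learning: decryption needs a discrete log of g^m, which is only feasible when plaintexts (quantized gradient values) are small, and the distributed variant in the paper further splits sk across parties so no single party can decrypt alone.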

Source journal: Digital Communications and Networks
Category: Computer Science – Hardware and Architecture
CiteScore: 12.80
Self-citation rate: 5.10%
Articles published: 915
Review time: 30 weeks
About the journal: Digital Communications and Networks is a journal focusing on communication systems and networks. It publishes original articles and authoritative reviews that undergo rigorous peer review. All articles are fully Open Access and available on ScienceDirect. The journal is indexed by databases such as the Science Citation Index Expanded (SCIE) and Scopus. In addition to regular articles, it may consider exceptional conference papers that have been significantly expanded, and it periodically releases special issues on specific aspects of the field.