PEPFL: A framework for a practical and efficient privacy-preserving federated learning
Yange Chen, Baocang Wang, Hang Jiang, Pu Duan, Yuan Ping, Zhiyong Hong
Digital Communications and Networks, published 2024-04-01. DOI: 10.1016/j.dcan.2022.05.019
Open access: https://www.sciencedirect.com/science/article/pii/S2352864822001122
Citations: 0
Abstract
As an emerging joint learning model, federated learning is a promising way to combine the model parameters of different users for training and inference without collecting users’ original data. However, previous work has not established a practical and efficient solution, owing to the absence of efficient matrix computation and cryptography schemes in privacy-preserving federated learning, especially in partially homomorphic cryptosystems. In this paper, we propose a Practical and Efficient Privacy-preserving Federated Learning (PEPFL) framework. First, we present a lifted distributed ElGamal cryptosystem for federated learning, which solves the multi-key problem in federated learning. Second, we develop a Practical Partially Single Instruction Multiple Data (PSIMD) parallelism scheme that encodes a plaintext matrix into a single plaintext for encryption, improving encryption efficiency and reducing the communication cost of partially homomorphic cryptosystems. In addition, based on a Convolutional Neural Network (CNN) and the designed cryptosystem, a novel privacy-preserving federated learning framework is designed using Momentum Gradient Descent (MGD). Finally, we evaluate the security and performance of PEPFL. The experimental results demonstrate that the scheme is practical, effective, and secure, with low communication and computation costs.
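The key cryptographic building block named in the abstract is a lifted (exponential) ElGamal cryptosystem, whose additive homomorphism allows encrypted model updates to be aggregated without decryption. The Python sketch below is a minimal toy illustration of that additive property only, with tiny insecure parameters; it does not model the paper's distributed multi-key construction, the PSIMD matrix packing, or the MGD training loop, and all parameter values and function names here are illustrative assumptions rather than the authors' implementation.

```python
# Toy sketch of lifted (exponential) ElGamal and its additive homomorphism.
# Insecure, illustration-only parameters; NOT the paper's distributed scheme.
import random

P = 227   # small safe prime (P = 2*Q + 1), toy size only
Q = 113   # prime order of the quadratic-residue subgroup
G = 4     # generator of that subgroup (4 = 2^2 mod P)

def keygen():
    sk = random.randrange(1, Q)      # secret key x
    pk = pow(G, sk, P)               # public key h = g^x mod p
    return sk, pk

def encrypt(pk, m):
    r = random.randrange(1, Q)       # fresh randomness per ciphertext
    return pow(G, r, P), (pow(G, m, P) * pow(pk, r, P)) % P   # (g^r, g^m * h^r)

def add(ct_a, ct_b):
    # Component-wise multiplication of ciphertexts adds plaintexts in the exponent.
    return (ct_a[0] * ct_b[0]) % P, (ct_a[1] * ct_b[1]) % P

def decrypt(sk, ct):
    c1, c2 = ct
    gm = (c2 * pow(c1, P - 1 - sk, P)) % P   # g^m = c2 / c1^x
    for m in range(Q):                       # brute-force discrete log; only feasible for small m
        if pow(G, m, P) == gm:
            return m
    raise ValueError("plaintext outside recoverable range")

sk, pk = keygen()
ct_sum = add(encrypt(pk, 7), encrypt(pk, 12))
assert decrypt(sk, ct_sum) == 19   # 7 + 12 recovered from the aggregated ciphertext
```

Because decryption in the lifted variant must recover m from g^m, it suits bounded values such as quantized or scaled gradient sums, which is consistent with the abstract's focus on efficient encoding of model parameters.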
About the journal:
Digital Communications and Networks is a journal focused on communication systems and networks. We publish original articles and authoritative reviews, all of which undergo rigorous peer review. All articles are fully Open Access and available on ScienceDirect, and the journal is indexed by databases such as the Science Citation Index Expanded (SCIE) and Scopus.
In addition to regular articles, we may also consider exceptional conference papers that have been significantly expanded. Furthermore, we periodically release special issues that focus on specific aspects of the field.
In short, Digital Communications and Networks is committed to quality and accessibility for researchers and scholars in the field of communication systems and networks.