Yao Pan, Zheng Chao, Wang He, Yang Jing, Li Hongjia, Wang Liming
{"title":"FedSHE:利用自适应分段 CKKS 同态加密技术保护隐私并提高联合学习效率","authors":"Yao Pan, Zheng Chao, Wang He, Yang Jing, Li Hongjia, Wang Liming","doi":"10.1186/s42400-024-00232-w","DOIUrl":null,"url":null,"abstract":"<p>Unprotected gradient exchange in federated learning (FL) systems may lead to gradient leakage-related attacks. CKKS is a promising approximate homomorphic encryption scheme to protect gradients, owing to its unique capability of performing operations directly on ciphertexts. However, configuring CKKS security parameters involves a trade-off between correctness, efficiency, and security. An evaluation gap exists regarding how these parameters impact computational performance. Additionally, the maximum vector length that CKKS can once encrypt, recommended by Homomorphic Encryption Standardization, is 16384, hampers its widespread adoption in FL when encrypting layers with numerous neurons. To protect gradients’ privacy in FL systems while maintaining practical performance, we comprehensively analyze the influence of security parameters such as polynomial modulus degree and coefficient modulus on homomorphic operations. Derived from our evaluation findings, we provide a method for selecting the optimal multiplication depth while meeting operational requirements. Then, we introduce an adaptive segmented encryption method tailored for CKKS, circumventing its encryption length constraint and enhancing its processing ability to encrypt neural network models. Finally, we present <i>FedSHE</i>, a privacy-preserving and efficient <i>Fed</i>erated learning scheme with adaptive <i>S</i>egmented CKKS <i>H</i>omomorphic <i>E</i>ncryption. <i>FedSHE</i> is implemented on top of the federated averaging (FedAvg) algorithm and is available at https://github.com/yooopan/FedSHE. Our evaluation results affirm the correctness and effectiveness of our proposed method, demonstrating that FedSHE outperforms existing homomorphic encryption-based federated learning research efforts in terms of model accuracy, computational efficiency, communication cost, and security level.</p>","PeriodicalId":36402,"journal":{"name":"Cybersecurity","volume":"15 1","pages":""},"PeriodicalIF":3.9000,"publicationDate":"2024-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FedSHE: privacy preserving and efficient federated learning with adaptive segmented CKKS homomorphic encryption\",\"authors\":\"Yao Pan, Zheng Chao, Wang He, Yang Jing, Li Hongjia, Wang Liming\",\"doi\":\"10.1186/s42400-024-00232-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Unprotected gradient exchange in federated learning (FL) systems may lead to gradient leakage-related attacks. CKKS is a promising approximate homomorphic encryption scheme to protect gradients, owing to its unique capability of performing operations directly on ciphertexts. However, configuring CKKS security parameters involves a trade-off between correctness, efficiency, and security. An evaluation gap exists regarding how these parameters impact computational performance. Additionally, the maximum vector length that CKKS can once encrypt, recommended by Homomorphic Encryption Standardization, is 16384, hampers its widespread adoption in FL when encrypting layers with numerous neurons. 
To protect gradients’ privacy in FL systems while maintaining practical performance, we comprehensively analyze the influence of security parameters such as polynomial modulus degree and coefficient modulus on homomorphic operations. Derived from our evaluation findings, we provide a method for selecting the optimal multiplication depth while meeting operational requirements. Then, we introduce an adaptive segmented encryption method tailored for CKKS, circumventing its encryption length constraint and enhancing its processing ability to encrypt neural network models. Finally, we present <i>FedSHE</i>, a privacy-preserving and efficient <i>Fed</i>erated learning scheme with adaptive <i>S</i>egmented CKKS <i>H</i>omomorphic <i>E</i>ncryption. <i>FedSHE</i> is implemented on top of the federated averaging (FedAvg) algorithm and is available at https://github.com/yooopan/FedSHE. Our evaluation results affirm the correctness and effectiveness of our proposed method, demonstrating that FedSHE outperforms existing homomorphic encryption-based federated learning research efforts in terms of model accuracy, computational efficiency, communication cost, and security level.</p>\",\"PeriodicalId\":36402,\"journal\":{\"name\":\"Cybersecurity\",\"volume\":\"15 1\",\"pages\":\"\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2024-07-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cybersecurity\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1186/s42400-024-00232-w\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cybersecurity","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1186/s42400-024-00232-w","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
FedSHE: privacy preserving and efficient federated learning with adaptive segmented CKKS homomorphic encryption
Unprotected gradient exchange in federated learning (FL) systems may lead to gradient leakage-related attacks. CKKS is a promising approximate homomorphic encryption scheme for protecting gradients, owing to its unique capability of performing operations directly on ciphertexts. However, configuring CKKS security parameters involves a trade-off between correctness, efficiency, and security, and there is an evaluation gap regarding how these parameters affect computational performance. Additionally, the maximum vector length that CKKS can encrypt at once, as recommended by Homomorphic Encryption Standardization, is 16,384, which hampers its widespread adoption in FL when encrypting layers with numerous neurons. To protect the privacy of gradients in FL systems while maintaining practical performance, we comprehensively analyze the influence of security parameters, such as the polynomial modulus degree and the coefficient modulus, on homomorphic operations. Based on our evaluation findings, we provide a method for selecting the optimal multiplication depth while meeting operational requirements. We then introduce an adaptive segmented encryption method tailored for CKKS that circumvents its encryption length constraint and enhances its ability to encrypt neural network models. Finally, we present FedSHE, a privacy-preserving and efficient Federated learning scheme with adaptive Segmented CKKS Homomorphic Encryption. FedSHE is implemented on top of the federated averaging (FedAvg) algorithm and is available at https://github.com/yooopan/FedSHE. Our evaluation results affirm the correctness and effectiveness of the proposed method and demonstrate that FedSHE outperforms existing homomorphic encryption-based federated learning schemes in terms of model accuracy, computational efficiency, communication cost, and security level.
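To make the segmentation idea concrete, the sketch below shows how a flattened gradient vector larger than the CKKS slot capacity can be split into slot-sized segments, encrypted segment by segment, and aggregated FedAvg-style directly on ciphertexts. This is an illustrative sketch only, not the authors' implementation (which is available at the repository linked above): it assumes TenSEAL as the CKKS backend, and the security parameters (polynomial modulus degree, coefficient modulus chain, global scale) are placeholders rather than the values FedSHE derives from its evaluation.

```python
# Illustrative sketch of segmented CKKS encryption for FL gradients.
# Assumes TenSEAL as the CKKS backend; parameters are placeholders.
import numpy as np
import tenseal as ts

POLY_MOD_DEGREE = 8192              # assumed polynomial modulus degree
SLOT_COUNT = POLY_MOD_DEGREE // 2   # CKKS packs N/2 real-valued slots per ciphertext


def make_context():
    """Create a CKKS context with an assumed, small-depth parameter set."""
    ctx = ts.context(
        ts.SCHEME_TYPE.CKKS,
        poly_modulus_degree=POLY_MOD_DEGREE,
        coeff_mod_bit_sizes=[60, 40, 40, 60],  # assumed; tied to the chosen mult. depth
    )
    ctx.global_scale = 2 ** 40
    ctx.generate_galois_keys()
    return ctx


def encrypt_segmented(ctx, flat_gradients):
    """Split a flattened gradient vector into slot-sized segments and encrypt each."""
    segments = [flat_gradients[i:i + SLOT_COUNT]
                for i in range(0, len(flat_gradients), SLOT_COUNT)]
    return [ts.ckks_vector(ctx, seg.tolist()) for seg in segments]


def aggregate(per_client_segments, weights):
    """FedAvg-style weighted aggregation, performed segment-wise on ciphertexts."""
    aggregated = []
    for seg_idx in range(len(per_client_segments[0])):
        acc = per_client_segments[0][seg_idx] * weights[0]
        for client_idx in range(1, len(per_client_segments)):
            acc = acc + per_client_segments[client_idx][seg_idx] * weights[client_idx]
        aggregated.append(acc)
    return aggregated


def decrypt_segmented(encrypted_segments, original_length):
    """Decrypt each segment and reassemble the flattened gradient vector."""
    flat = []
    for seg in encrypted_segments:
        flat.extend(seg.decrypt())
    return np.array(flat[:original_length])


if __name__ == "__main__":
    ctx = make_context()
    # Two toy clients whose layer is larger than one ciphertext can hold.
    g1 = np.random.randn(10_000)
    g2 = np.random.randn(10_000)
    enc1, enc2 = encrypt_segmented(ctx, g1), encrypt_segmented(ctx, g2)
    agg = aggregate([enc1, enc2], weights=[0.5, 0.5])
    recovered = decrypt_segmented(agg, len(g1))
    # CKKS is approximate, so compare with a tolerance.
    print(np.allclose(recovered, 0.5 * g1 + 0.5 * g2, atol=1e-3))
```

In a deployment of this kind, the aggregation server would hold only the public and evaluation keys while clients keep the secret key, and the segment size would adapt to the slot capacity implied by whichever polynomial modulus degree the parameter-selection step chooses.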