Secure Federated Learning with a Homomorphic Encryption Model

Nadia Hussien, Nadia Mahmood Hussien, Saba Abdulbaqi Salman, Mohammad Aljanabi
{"title":"基于同态加密模型的安全联邦学习","authors":"Nadia Hussien, Nadia Mahmood Hussien, Saba Abdulbaqi Salman, Mohammad Aljanabi","doi":"10.47667/ijpasr.v4i3.235","DOIUrl":null,"url":null,"abstract":"Federated learning (FL) offers collaborative machine learning across decentralized devices while safeguarding data privacy. However, data security and privacy remain key concerns. This paper introduces \"Secure Federated Learning with a Homomorphic Encryption Model,\" addressing these challenges by integrating homomorphic encryption into FL. The model starts by initializing a global machine learning model and generating a homomorphic encryption key pair, with the public key shared among FL participants. Using this public key, participants then collect, preprocess, and encrypt their local data. During FL Training Rounds, participants decrypt the global model, compute local updates on encrypted data, encrypt these updates, and securely send them to the aggregator. The aggregator homomorphic ally combines updates without revealing participant data, forwarding the encrypted aggregated update to the global model owner. The Global Model Update ensures the owner decrypts the aggregated update using the private key, updates the global model, encrypts it with the public key, and shares the encrypted global model with FL participants. With optional model evaluation, training can iterate for several rounds or until convergence. This model offers a robust solution to Florida data privacy and security issues, with versatile applications across domains. This paper presents core model components, advantages, and potential domain-specific implementations while making significant strides in addressing FL's data privacy concerns.","PeriodicalId":14397,"journal":{"name":"International Journal Papier Advance and Scientific Review","volume":"131 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Secure Federated Learning with a Homomorphic Encryption Model\",\"authors\":\"Nadia Hussien, Nadia Mahmood Hussien, Saba Abdulbaqi Salman, Mohammad Aljanabi\",\"doi\":\"10.47667/ijpasr.v4i3.235\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning (FL) offers collaborative machine learning across decentralized devices while safeguarding data privacy. However, data security and privacy remain key concerns. This paper introduces \\\"Secure Federated Learning with a Homomorphic Encryption Model,\\\" addressing these challenges by integrating homomorphic encryption into FL. The model starts by initializing a global machine learning model and generating a homomorphic encryption key pair, with the public key shared among FL participants. Using this public key, participants then collect, preprocess, and encrypt their local data. During FL Training Rounds, participants decrypt the global model, compute local updates on encrypted data, encrypt these updates, and securely send them to the aggregator. The aggregator homomorphic ally combines updates without revealing participant data, forwarding the encrypted aggregated update to the global model owner. The Global Model Update ensures the owner decrypts the aggregated update using the private key, updates the global model, encrypts it with the public key, and shares the encrypted global model with FL participants. With optional model evaluation, training can iterate for several rounds or until convergence. 
This model offers a robust solution to Florida data privacy and security issues, with versatile applications across domains. This paper presents core model components, advantages, and potential domain-specific implementations while making significant strides in addressing FL's data privacy concerns.\",\"PeriodicalId\":14397,\"journal\":{\"name\":\"International Journal Papier Advance and Scientific Review\",\"volume\":\"131 3\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal Papier Advance and Scientific Review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.47667/ijpasr.v4i3.235\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal Papier Advance and Scientific Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.47667/ijpasr.v4i3.235","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Federated learning (FL) offers collaborative machine learning across decentralized devices while safeguarding data privacy. However, data security and privacy remain key concerns. This paper introduces "Secure Federated Learning with a Homomorphic Encryption Model," addressing these challenges by integrating homomorphic encryption into FL. The model starts by initializing a global machine learning model and generating a homomorphic encryption key pair, with the public key shared among FL participants. Using this public key, participants then collect, preprocess, and encrypt their local data. During FL training rounds, participants decrypt the global model, compute local updates, encrypt these updates, and securely send them to the aggregator. The aggregator homomorphically combines the updates without revealing participant data and forwards the encrypted aggregated update to the global model owner. During the global model update, the owner decrypts the aggregated update using the private key, updates the global model, encrypts it with the public key, and shares the encrypted global model with FL participants. With optional model evaluation, training can iterate for several rounds or until convergence. This model offers a robust solution to FL's data privacy and security issues, with versatile applications across domains. This paper presents the model's core components, advantages, and potential domain-specific implementations, making significant strides in addressing FL's data privacy concerns.
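
To make the encrypted-aggregation round concrete, the sketch below walks through one FL training round under an additively homomorphic Paillier scheme using the python-paillier (`phe`) package. The abstract does not name a specific encryption scheme, library, participant count, or update representation, so the key size, the three simulated participants, and the flat lists of update values are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of one encrypted FL aggregation round, assuming the
# Paillier additively homomorphic scheme from the python-paillier (phe)
# package. Participant count, key size, and the toy update vectors are
# illustrative placeholders, not the paper's implementation.
from phe import paillier

# Model owner: generate the homomorphic key pair. The public key is
# shared with all FL participants; the private key stays with the owner.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each participant computes a local model update (here a toy 3-weight
# gradient) and encrypts it element-wise with the shared public key.
local_updates = [
    [0.10, -0.20, 0.05],   # participant 1
    [0.30,  0.10, -0.15],  # participant 2
    [-0.05, 0.25, 0.20],   # participant 3
]
encrypted_updates = [
    [public_key.encrypt(w) for w in update] for update in local_updates
]

# Aggregator: sum the ciphertexts coordinate-wise. Adding Paillier
# ciphertexts adds the underlying plaintexts, so the aggregator never
# sees any individual participant's update.
encrypted_sum = encrypted_updates[0]
for update in encrypted_updates[1:]:
    encrypted_sum = [acc + w for acc, w in zip(encrypted_sum, update)]

# Scaling by the public constant 1/n keeps the result encrypted while
# turning the sum into a FedAvg-style average.
n = len(local_updates)
encrypted_avg = [c * (1.0 / n) for c in encrypted_sum]

# Model owner: decrypt only the aggregated update, apply it to the
# global model, then redistribute the model for the next round.
aggregated_update = [private_key.decrypt(c) for c in encrypted_avg]
print(aggregated_update)  # approximately [0.1167, 0.05, 0.0333]
```

Paillier supports only ciphertext addition and plaintext-scalar multiplication, which suffices for the sum-and-average aggregation step shown here; if local training itself had to run over encrypted data, a fully homomorphic scheme (e.g., CKKS) would be required instead.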