Private Data Preprocessing for Privacy-preserving Federated Learning

Ruei-Hau Hsu, Ting-Yun Huang
2022 IEEE 5th International Conference on Knowledge Innovation and Invention (ICKII), published 2022-07-22. DOI: 10.1109/ICKII55100.2022.9983518
Citations: 0

Abstract

Privacy-preserving federated learning can aggregate models without exposing any participant's local model, avoiding the sensitive-data exposure that model leakage can cause. Although this protects privacy during training, the data analysis task proposed by the initiator, and the type of data that task requires, may themselves contain the initiator's research or trade secrets. If the task description is intercepted in transit or learned by other data providers, essential research secrets are disclosed, enabling the theft of research or business ideas. It is therefore a critical issue to match data between the initiator and the participants while preserving privacy. In this study, we propose a federated learning framework that addresses these security issues. A privacy-preserving federated learning architecture based on homomorphic encryption protects each participant's data and local model. In addition, an encrypted-query technique in this architecture provides private data matching: each data provider searches its data in ciphertext, finds the encrypted records that meet the task's conditions, and completes the training process without the task initiator's requirements being disclosed.
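The abstract does not spell out the homomorphic-encryption construction. As a minimal sketch of the additively homomorphic aggregation it describes, the following toy Paillier example (small fixed primes and hypothetical integer-quantized model updates, both assumptions for illustration; a real deployment needs at least 2048-bit primes) shows how an aggregator can sum encrypted client updates without decrypting any individual one.

```python
import math
import random

def keygen(p=104729, q=104723):
    """Toy Paillier keypair. Demo-sized primes only; real systems need >=2048-bit moduli."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because we fix the generator g = n + 1 below
    return n, (lam, mu, n)

def encrypt(n, m):
    """Enc(m) = (1+n)^m * r^n mod n^2 -- additively homomorphic."""
    n2 = n * n
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:   # r must be a unit mod n
            break
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def add_ciphertexts(n, c1, c2):
    """Multiplying ciphertexts yields an encryption of the plaintext sum."""
    return (c1 * c2) % (n * n)

def decrypt(sk, c):
    lam, mu, n = sk
    x = pow(c, lam, n * n)        # equals 1 + (m*lam)*n mod n^2
    return ((x - 1) // n) * mu % n

# Each client encrypts a quantized model update; the aggregator multiplies the
# ciphertexts and only the key holder can decrypt the aggregate.
n, sk = keygen()
updates = [17, 25, 8]             # hypothetical quantized local updates
agg = encrypt(n, 0)
for u in updates:
    agg = add_ciphertexts(n, agg, encrypt(n, u))
print(decrypt(sk, agg))           # 50
```

In practice, real-valued gradients would be fixed-point quantized before encryption, and the decryption key would sit with a trusted party or be threshold-shared so that no single aggregator can open individual updates.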
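The abstract also does not name its encrypted-query scheme. One common building block with the stated shape is searchable-symmetric-encryption-style keyword tagging; the sketch below (hypothetical record IDs and keywords, with a search key assumed to be established at setup) shows how a data provider can match an opaque query token against tagged records without ever seeing the query itself.

```python
import hmac
import hashlib

def tag(key: bytes, keyword: str) -> bytes:
    # Pseudorandom tag: without `key`, the tag reveals nothing about the keyword.
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

def build_index(key: bytes, records):
    """Setup phase: tag each record's keywords under the shared search key."""
    index = {}
    for rid, keywords in records:
        for kw in keywords:
            index.setdefault(tag(key, kw), []).append(rid)
    return index

def search(index, token: bytes):
    # The provider matches opaque tokens; the plaintext query never leaves the initiator.
    return index.get(token, [])

# Hypothetical provider dataset and an initiator query for X-ray records.
key = b"search-key-shared-at-setup"   # assumption: agreed during setup
index = build_index(key, [("rec1", ["xray", "chest"]), ("rec2", ["mri", "knee"])])
token = tag(key, "xray")              # initiator derives the search token locally
print(search(index, token))           # ['rec1']
```

This is only an illustrative primitive: full searchable-encryption schemes additionally hide access patterns and result sizes, which a bare HMAC index does not.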