Data Compliance Utilization Method Based on Adaptive Differential Privacy and Federated Learning

Impact Factor: 6.4
Haiyan Kang, Bing Wu, Chong Zhang
DOI: 10.1142/S0129065725500601
Journal: International Journal of Neural Systems, p. 2550060
Published: 2025-08-30
Citations: 0

Abstract

Federated learning (FL) coordinates multiple clients to train a shared model without collecting their local data, and is therefore naturally privacy-preserving. However, a malicious attacker can still steal intermediate parameters during model training and use them to infer a user's original data, leaking sensitive information. To address this problem, we propose an adaptive differential privacy blockchain federated learning (ADP-BCFL) method that enables compliant use of distributed data while ensuring security. First, a blockchain is used for secure storage and efficient querying of user summary data. Second, an adaptive differential privacy (DP) mechanism is applied during federated learning: it adaptively adjusts the clipping threshold according to the characteristics of the parameters, controlling the amount of noise introduced, and thus maintains good global model accuracy while effectively defending against inference attacks. Finally, ADP-BCFL was validated on the MNIST and Fashion-MNIST datasets and a spatiotemporal dataset, showing that it effectively balances model performance and privacy.
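The abstract does not specify the paper's exact adaptation rule, so the following is only a minimal sketch of the general idea: clip each client update to a threshold, add Gaussian noise calibrated to that threshold, and adapt the threshold from the observed update norms (here, by moving it toward their median; the function name, the `adapt_rate` parameter, and the median heuristic are illustrative assumptions, not the authors' method).

```python
import numpy as np

def adaptive_dp_aggregate(updates, clip_threshold, noise_multiplier, adapt_rate=0.2):
    """Aggregate per-client updates with adaptive clipping and Gaussian noise.

    updates         : list of 1-D numpy arrays (per-client model updates)
    clip_threshold  : current clipping norm C
    noise_multiplier: noise scale relative to C (0 disables noise)
    adapt_rate      : how fast C moves toward the median observed norm
    Returns (noisy mean update, updated clip threshold).
    """
    norms = np.array([np.linalg.norm(u) for u in updates])
    # Clip: rescale any update whose L2 norm exceeds the threshold.
    clipped = [u * min(1.0, clip_threshold / max(n, 1e-12))
               for u, n in zip(updates, norms)]
    mean_update = np.mean(clipped, axis=0)
    # Gaussian noise whose scale is tied to the clipping threshold
    # (the sensitivity of the clipped mean).
    sigma = noise_multiplier * clip_threshold / len(updates)
    noisy = mean_update + np.random.normal(0.0, sigma, size=mean_update.shape)
    # Adapt the threshold toward the median update norm, so less noise is
    # injected when updates are small and clipping bites less when they grow.
    new_threshold = (1 - adapt_rate) * clip_threshold + adapt_rate * np.median(norms)
    return noisy, new_threshold
```

Tying the noise scale to the clipping threshold is what lets an adaptive threshold "control the amount of introduced noise": as the threshold shrinks to match smaller updates, the injected noise shrinks with it.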
