Differentially Private Federated Learning with Drift Control

Wei-Ting Chang, Mohamed Seif, R. Tandon
2022 56th Annual Conference on Information Sciences and Systems (CISS), published 2022-03-09.
DOI: 10.1109/CISS53076.2022.9751200 (https://doi.org/10.1109/CISS53076.2022.9751200)

Abstract

In this paper, we consider the problem of differentially private federated learning with statistical data heterogeneity. Specifically, users collaborate with a parameter server (PS) to jointly train a machine learning model using local datasets that are non-i.i.d. across users. The PS is assumed to be honest-but-curious, so the users' data must be kept private from the PS; in particular, interactions between the PS and users must satisfy differential privacy (DP) for each user. In this work, we propose a differentially private mechanism that simultaneously handles the user drift caused by non-i.i.d. data and the randomized user participation in the training process. Specifically, we study SCAFFOLD, a popular federated learning algorithm that has shown better performance in dealing with non-i.i.d. data than earlier federated averaging algorithms, and we analyze its convergence rate under a differential privacy constraint. Our convergence results account for the time-varying perturbation noise used by the users as well as data and user sampling. We propose two time-varying noise allocation schemes that achieve a better convergence rate while satisfying a total DP privacy budget. We also conduct experiments on a real-world dataset to confirm our theoretical findings.
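The two ingredients described in the abstract can be illustrated with a short sketch. This is not the paper's algorithm, only a minimal illustration under common assumptions: a SCAFFOLD-style local update with control variates (drift correction), followed by L2 clipping and Gaussian perturbation of the model delta for DP, plus a hypothetical geometric schedule as one way a time-varying noise allocation could spend a total (zCDP-style) budget. All function names, the schedule, and the control-variate update rule (SCAFFOLD's "option II") are assumptions for illustration.

```python
import numpy as np

def clip(v, c):
    """Clip a vector to L2 norm at most c (bounds the DP sensitivity)."""
    n = np.linalg.norm(v)
    return v if n <= c else v * (c / n)

def local_update_dp_scaffold(x, c_i, c, grads, lr, clip_norm, sigma, rng):
    """One client's K local steps with SCAFFOLD drift correction (-c_i + c),
    then clipping and Gaussian noise on the released model delta.
    `grads` is a list of per-step gradient functions (one per local step)."""
    y = x.copy()
    for g in grads:                       # K local SGD steps
        y = y - lr * (g(y) - c_i + c)     # control-variate correction
    delta = clip(y - x, clip_norm)        # bound per-user sensitivity
    noisy_delta = delta + rng.normal(0.0, sigma * clip_norm, size=delta.shape)
    c_i_new = c_i - c + (x - y) / (lr * len(grads))  # SCAFFOLD "option II"
    return noisy_delta, c_i_new

def geometric_sigma_schedule(T, sigma0, r):
    """Hypothetical time-varying allocation: sigma_t = sigma0 * r**t.
    Under the Gaussian mechanism in zCDP, round t costs ~ 1/(2*sigma_t**2),
    so the total spent budget is the returned geometric sum."""
    sigmas = [sigma0 * r ** t for t in range(T)]
    total_rho = sum(1.0 / (2.0 * s ** 2) for s in sigmas)
    return sigmas, total_rho
```

For instance, with a quadratic loss whose gradient is `y - 1`, five local steps drift `y` toward the minimizer, and only the clipped, noised delta would be sent to the PS; the schedule with `r < 1` spends more noise early and less later, which is one plausible shape for trading off convergence against a fixed total budget.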