{"title":"Differentially Private Federated Learning with Drift Control","authors":"Wei-Ting Chang, Mohamed Seif, R. Tandon","doi":"10.1109/CISS53076.2022.9751200","DOIUrl":null,"url":null,"abstract":"In this paper, we consider the problem of differentially private federated learning with statistical data heterogeneity. More specifically, users collaborate with the parameter server (PS) to jointly train a machine learning model using their local datasets that are non-i.i.d. across users. The PS is assumed to be honest-but-curious so that the data at users need to be kept private from the PS. More specifically, interactions between the PS and users must satisfy differential privacy (DP) for each user. In this work, we propose a differentially private mechanism that simultaneously deals with user-drift caused by non-i.i.d. data and the randomized user participation in the training process. Specifically, we study SCAFFOLD, a popular federated learning algorithm, that has shown better performance on dealing with non-i.i.d. data than previous federated averaging algorithms. We study the convergence rate of SCAFFOLD under differential privacy constraint. Our convergence results take into account time-varying perturbation noises used by the users, and data and user sampling. We propose two time-varying noise allocation schemes in order to achieve better convergence rate and satisfy a total DP privacy budget. We also conduct experiments to confirm our theoretical findings on real world dataset.","PeriodicalId":305918,"journal":{"name":"2022 56th Annual Conference on Information Sciences and Systems (CISS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 56th Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS53076.2022.9751200","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we consider the problem of differentially private federated learning under statistical data heterogeneity. Specifically, users collaborate with a parameter server (PS) to jointly train a machine learning model using local datasets that are non-i.i.d. across users. The PS is assumed to be honest-but-curious, so the users' data must be kept private from the PS; that is, every interaction between the PS and a user must satisfy differential privacy (DP) for that user. In this work, we propose a differentially private mechanism that simultaneously handles the user drift caused by non-i.i.d. data and randomized user participation in the training process. Specifically, we study SCAFFOLD, a popular federated learning algorithm that has been shown to handle non-i.i.d. data better than earlier federated averaging algorithms, and we analyze its convergence rate under DP constraints. Our convergence results account for the time-varying perturbation noise added by the users as well as data and user sampling. We further propose two time-varying noise allocation schemes that achieve a better convergence rate while satisfying a total DP privacy budget. Finally, we conduct experiments on a real-world dataset to confirm our theoretical findings.
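To make the setting concrete, below is a minimal, illustrative Python sketch of a SCAFFOLD-style local update combined with per-round Gaussian perturbation under a decaying (time-varying) noise schedule. It is not the paper's exact mechanism: the toy quadratic loss, the clipping norm, the noise schedule `sigma_t = 0.5 / (t + 1)`, and the full-participation aggregation are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the authors' exact algorithm): SCAFFOLD-style
# local updates with control variates for drift correction, plus clipping and
# time-varying Gaussian noise on the model update sent to the parameter server.
import numpy as np

rng = np.random.default_rng(0)

def clip(v, clip_norm):
    """Clip a vector to L2 norm at most clip_norm."""
    norm = np.linalg.norm(v)
    return v if norm <= clip_norm else v * (clip_norm / norm)

def local_update(x_global, c_global, c_local, data, lr, local_steps,
                 clip_norm, sigma_t):
    """One user's SCAFFOLD-style local round with DP perturbation.

    x_global: current server model; c_global / c_local: server and user
    control variates; sigma_t: per-round noise std (time-varying, assumed).
    """
    x = x_global.copy()
    for _ in range(local_steps):
        # Toy quadratic loss 0.5*||x - mean(data)||^2 -> gradient x - mean(data).
        grad = x - data.mean(axis=0)
        # SCAFFOLD drift correction: use grad - c_local + c_global.
        x -= lr * (grad - c_local + c_global)
    # Updated local control variate (SCAFFOLD "option II" style).
    c_new = c_local - c_global + (x_global - x) / (local_steps * lr)
    # Clip the model update and add Gaussian noise before sending to the PS.
    delta = clip(x - x_global, clip_norm)
    delta += rng.normal(0.0, sigma_t * clip_norm, size=delta.shape)
    return delta, c_new - c_local

# Toy usage: two users with different (non-i.i.d.) data distributions.
d = 5
x_server = np.zeros(d)
c_server = np.zeros(d)
c_users = [np.zeros(d), np.zeros(d)]
datasets = [rng.normal(+1.0, 1.0, size=(100, d)),
            rng.normal(-1.0, 1.0, size=(100, d))]

for t in range(50):
    sigma_t = 0.5 / (t + 1)  # example decaying noise schedule (assumption)
    deltas, c_deltas = [], []
    for i, data in enumerate(datasets):
        delta, c_delta = local_update(x_server, c_server, c_users[i], data,
                                      lr=0.1, local_steps=5,
                                      clip_norm=1.0, sigma_t=sigma_t)
        c_users[i] += c_delta
        deltas.append(delta)
        c_deltas.append(c_delta)
    # Full participation assumed here for simplicity; the paper additionally
    # models randomized user and data sampling.
    x_server += np.mean(deltas, axis=0)
    c_server += np.mean(c_deltas, axis=0)
```

The decaying `sigma_t` is only one possible instance of a time-varying noise allocation; the point of the sketch is where the drift-correction term and the DP perturbation enter the update, not the specific schedule.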