{"title":"Differentially Private Distributed Optimization","authors":"Zhenqi Huang, S. Mitra, N. Vaidya","doi":"10.1145/2684464.2684480","DOIUrl":null,"url":null,"abstract":"In distributed optimization and iterative consensus literature, a standard problem is for N agents to minimize a function f over a subset of Euclidean space, where the cost function is expressed as a sum Σ fi. In this paper, we study the private distributed optimization problem (PDOP) with the additional requirement that the cost function of the individual agents should remain differentially private. The adversary attempts to infer information about the private cost functions from the messages that the agents exchange. Achieving differential privacy requires that any change of an individual's cost function only results in unsubstantial changes in the statistics of the messages. We propose a class of iterative algorithms for solving PDOP, which achieves differential privacy and convergence to a common value. Our analysis reveals the dependence of the achieved accuracy and the privacy levels on the the parameters of the algorithm. We observe that to achieve ε-differential privacy the accuracy of the algorithm has the order of O(1/ε2).","PeriodicalId":298587,"journal":{"name":"Proceedings of the 16th International Conference on Distributed Computing and Networking","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"214","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 16th International Conference on Distributed Computing and Networking","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2684464.2684480","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 214
Abstract
In the distributed optimization and iterative consensus literature, a standard problem is for N agents to minimize a function f over a subset of Euclidean space, where the cost function is expressed as a sum f = Σ_i f_i of the agents' individual cost functions. In this paper, we study the private distributed optimization problem (PDOP), which adds the requirement that the cost function of each individual agent remain differentially private. The adversary attempts to infer information about the private cost functions from the messages that the agents exchange. Achieving differential privacy requires that any change in an individual's cost function result in only insubstantial changes in the statistics of the messages. We propose a class of iterative algorithms for solving PDOP that achieve differential privacy and convergence to a common value. Our analysis reveals how the achieved accuracy and the privacy level depend on the parameters of the algorithm. We observe that to achieve ε-differential privacy, the accuracy of the algorithm is of the order O(1/ε²).
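To make the setting concrete, the following is a minimal Python sketch of the general kind of iterative scheme the abstract describes: agents with private costs repeatedly average noisy messages and take a diminishing gradient step. It is not the paper's actual algorithm; the quadratic costs f_i(x) = ½(x − c_i)², the noise-decay rate, the step sizes, and the calibration of the Laplace noise to ε are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's exact algorithm): N agents with private
# quadratic costs f_i(x) = 0.5 * (x - c_i)^2 run a consensus-plus-gradient
# iteration.  Each agent perturbs the message it broadcasts with Laplace
# noise whose magnitude decays geometrically, so the cumulative privacy
# loss over all rounds stays bounded.  All parameter choices below
# (epsilon, decay rate, step sizes) are assumed for illustration only.

rng = np.random.default_rng(0)

N = 5                         # number of agents
c = rng.uniform(-1, 1, N)     # private data defining each agent's cost
epsilon = 1.0                 # target differential-privacy level (assumed)
q = 0.9                       # geometric decay rate of the injected noise
T = 200                       # number of iterations

x = np.zeros(N)               # each agent's local estimate
W = np.full((N, N), 1.0 / N)  # doubly stochastic averaging weights (complete graph)

for t in range(T):
    # Noise scale decays with t; the calibration tying the per-round
    # Laplace scale to the overall budget epsilon is an assumption.
    noise_scale = (q ** t) / epsilon
    messages = x + rng.laplace(scale=noise_scale, size=N)

    # Consensus step on the noisy messages, then a diminishing gradient
    # step on the private cost (the gradient of f_i at x_i is x_i - c_i).
    step = 1.0 / (t + 2)
    x = W @ messages - step * (x - c)

print("non-private optimum:", c.mean())
print("final local estimates:", np.round(x, 3))
```

Running the sketch, the local estimates cluster around the average of the private values c_i, while the injected Laplace noise keeps any single agent's data from being pinned down from the message history; shrinking ε forces larger noise and, consistent with the abstract's O(1/ε²) observation, a looser final accuracy.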