Title: Privacy-preserving release of aggregate dynamic models
Authors: J. L. Ny, George J. Pappas
Venue: International Conference on High Confidence Networked Systems
Published: 2013-04-09
DOI: 10.1145/2461446.2461454 (https://doi.org/10.1145/2461446.2461454)
Citations: 12
Abstract
New solutions proposed for the monitoring and control of large-scale systems increasingly rely on sensitive data provided by end-users. As a result, there is a need to provide guarantees that these systems do not unintentionally leak private and confidential information during their operation. Motivated by this context, this paper discusses the problem of releasing a dynamic model describing the aggregate input-output dynamics of an ensemble of subsystems coupled via a common input and output, while controlling the amount of information that an adversary can infer about the dynamics of the individual subsystems. Such a model can then be used as an approximation of the true system, e.g., for controller design purposes. The proposed schemes rely on the notion of differential privacy, which provides strong and quantitative privacy guarantees that can be used by individuals to evaluate the risk/reward trade-offs involved in releasing detailed information about their behavior.
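To make the differential-privacy notion the abstract refers to concrete, the sketch below shows the standard Laplace mechanism applied to an aggregate quantity. This is a generic illustration of differential privacy, not the paper's specific model-release scheme; the subsystem outputs, the sensitivity bound, and the `laplace_mechanism` helper are all hypothetical.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon."""
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical setup: each subsystem contributes an output in [0, 1], and we
# release their sum. Changing one subsystem's data moves the sum by at most
# 1, so the sensitivity of the aggregate is 1.
outputs = np.array([0.2, 0.9, 0.4, 0.7])
aggregate = outputs.sum()

# Smaller epsilon means stronger privacy but a noisier released value --
# the risk/reward trade-off individuals can evaluate.
private_aggregate = laplace_mechanism(aggregate, sensitivity=1.0, epsilon=0.5)
```

The noise scale grows as epsilon shrinks, which is the quantitative guarantee the abstract alludes to: an adversary observing the released aggregate can infer only a bounded amount about any single subsystem's contribution.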