Differentially Private Algorithms for Statistical Verification of Cyber-Physical Systems

Yu Wang;Hussein Sibai;Mark Yen;Sayan Mitra;Geir E. Dullerud
{"title":"Differentially Private Algorithms for Statistical Verification of Cyber-Physical Systems","authors":"Yu Wang;Hussein Sibai;Mark Yen;Sayan Mitra;Geir E. Dullerud","doi":"10.1109/OJCSYS.2022.3207108","DOIUrl":null,"url":null,"abstract":"Statistical model checking is a class of sequential algorithms that can verify specifications of interest on an ensemble of cyber-physical systems (e.g., whether 99% of cars from a batch meet a requirement on their functionality). These algorithms infer the probability that given specifications are satisfied by the systems with provable statistical guarantees by drawing sufficient numbers of independent and identically distributed samples. During the process of statistical model checking, the values of the samples (e.g., a user's car trajectory) may be inferred by intruders, causing privacy concerns in consumer-level applications (e.g., automobiles and medical devices). This paper addresses the privacy of statistical model checking algorithms from the point of view of differential privacy. These algorithms are sequential, drawing samples until a condition on their values is met. We show that revealing the number of samples drawn can violate privacy. We also show that the standard exponential mechanism that randomizes the output of an algorithm to achieve differential privacy fails to do so in the context of sequential algorithms. Instead, we relax the conservative requirement in differential privacy that the sensitivity of the output of the algorithm should be bounded to any perturbation for any data set. We propose a new notion of differential privacy which we call \n<italic>expected differential privacy</i>\n (EDP). Then, we propose a novel expected sensitivity analysis for the sequential algorithm and propose a corresponding exponential mechanism that randomizes the termination time to achieve the EDP. 
We apply the proposed exponential mechanism to statistical model checking algorithms to preserve the privacy of the samples they draw. The utility of the proposed algorithm is demonstrated in a case study.","PeriodicalId":73299,"journal":{"name":"IEEE open journal of control systems","volume":"1 ","pages":"294-305"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/iel7/9552933/9683993/09893303.pdf","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE open journal of control systems","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/9893303/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 3

Abstract

Statistical model checking is a class of sequential algorithms that can verify specifications of interest on an ensemble of cyber-physical systems (e.g., whether 99% of cars from a batch meet a requirement on their functionality). These algorithms infer, with provable statistical guarantees, the probability that given specifications are satisfied by the systems, by drawing sufficiently many independent and identically distributed samples. During the process of statistical model checking, the values of the samples (e.g., a user's car trajectory) may be inferred by intruders, causing privacy concerns in consumer-level applications (e.g., automobiles and medical devices). This paper addresses the privacy of statistical model checking algorithms from the point of view of differential privacy. These algorithms are sequential, drawing samples until a condition on their values is met. We show that revealing the number of samples drawn can violate privacy. We also show that the standard exponential mechanism, which randomizes the output of an algorithm to achieve differential privacy, fails to do so in the context of sequential algorithms. Instead, we relax the conservative requirement in differential privacy that the sensitivity of the algorithm's output be bounded under any perturbation of any data set. We propose a new notion of differential privacy, which we call expected differential privacy (EDP). We then propose a novel expected sensitivity analysis for the sequential algorithm, together with a corresponding exponential mechanism that randomizes the termination time to achieve EDP. We apply the proposed exponential mechanism to statistical model checking algorithms to preserve the privacy of the samples they draw. The utility of the proposed algorithm is demonstrated in a case study.
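The abstract's two core ingredients — a sequential test whose termination time can leak information about the samples, and a randomization of that termination time before it is revealed — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function names (`smc_sequential`, `noisy_termination`), the SPRT-style thresholds, and the discrete-Laplace-style perturbation standing in for the paper's expected-sensitivity exponential mechanism are all assumptions made here for concreteness.

```python
import math
import random

def smc_sequential(sample, p0=0.9, delta=0.05, eps=0.05):
    """Sequential probability-ratio-style test (hypothetical interface).

    `sample()` returns 1 if a drawn system run satisfies the spec, else 0.
    Draws samples until the log-likelihood ratio between the hypotheses
    p >= p0 + eps and p <= p0 - eps crosses a threshold set by the
    allowed error probability `delta`. Returns (verdict, num_samples).
    The number of samples drawn depends on the sample values, which is
    the leakage channel the paper analyzes.
    """
    a = math.log((1 - delta) / delta)  # acceptance/rejection threshold
    p1, p2 = p0 + eps, p0 - eps
    llr, n = 0.0, 0
    while abs(llr) < a:
        x = sample()
        n += 1
        if x:
            llr += math.log(p1 / p2)
        else:
            llr += math.log((1 - p1) / (1 - p2))
    return llr >= a, n

def noisy_termination(n, privacy_eps=1.0, sensitivity=1.0):
    """Randomize the reported termination time before release.

    Simplified stand-in for the paper's exponential mechanism: draws a
    two-sided geometric (discrete-Laplace-style) offset, i.e. reports k
    with probability proportional to exp(-privacy_eps * |k - n| /
    (2 * sensitivity)), clipped below at 1 sample.
    """
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    b = 2 * sensitivity / privacy_eps
    offset = round(-b * math.copysign(1.0, u) * math.log(1 - 2 * abs(u)))
    return max(1, n + offset)
```

Publishing the perturbed count from `noisy_termination` instead of the true `n` hides, up to the privacy budget, how quickly the evidence accumulated — the quantity the paper shows can otherwise violate the privacy of the underlying samples.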