Achieving Transparency Report Privacy in Linear Time

Chien-Lun Chen, L. Golubchik, R. Pal
DOI: 10.1145/3460001 (https://doi.org/10.1145/3460001)
Journal: ACM Journal of Data and Information Quality (JDIQ)
Published: 2021-03-31 (Journal Article)
Citations: 1

Abstract

An accountable algorithmic transparency report (ATR) should ideally investigate (a) the transparency of the underlying algorithm and (b) the fairness of the algorithmic decisions, while at the same time preserving data subjects’ privacy. However, a provably formal study of the impact on data subjects’ privacy caused by the utility of releasing an ATR (one that investigates transparency and fairness) has yet to appear in the literature. The far-reaching benefit of such a study lies in the methodical characterization of privacy-utility trade-offs for the public release of ATRs, and their consequent application-specific impact on society, politics, and economics. In this paper, we first investigate and demonstrate potential privacy hazards brought on by the deployment of transparency and fairness measures in released ATRs. To preserve data subjects’ privacy, we then propose a linear-time optimal-privacy scheme, built upon standard linear fractional programming (LFP) theory, for announcing ATRs, subject to constraints controlling the tolerance of privacy perturbation on the utility of transparency schemes. Subsequently, we quantify the privacy-utility trade-offs induced by our scheme and analyze the impact of privacy perturbation on fairness measures in ATRs. To the best of our knowledge, this is the first analytical work that simultaneously addresses the trade-offs among the triad of privacy, utility, and fairness as applicable to algorithmic transparency reports.
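The abstract states that the privacy scheme is built on standard linear fractional programming (LFP) theory. The paper's actual ATR scheme is not reproduced here; purely as background, the sketch below illustrates the classical Charnes-Cooper transformation, which reduces an LFP (minimizing a ratio of affine functions over a polytope) to an ordinary linear program. The tiny two-variable instance and the brute-force vertex enumeration used to solve the resulting LP are illustrative assumptions, not the authors' construction.

```python
from itertools import combinations

def solve3(M, rhs):
    """Solve a 3x3 linear system by Gauss-Jordan elimination; None if singular."""
    A = [row[:] + [r] for row, r in zip(M, rhs)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        if abs(A[piv][col]) < 1e-12:
            return None
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                for k in range(col, 4):
                    A[r][k] -= f * A[col][k]
    return [A[i][3] / A[i][i] for i in range(3)]

def solve_lfp(c, alpha, d, beta, A, b):
    """Minimise (c.x + alpha) / (d.x + beta) over {A x <= b, x >= 0}, x in R^2,
    assuming d.x + beta > 0 on the feasible region.

    Charnes-Cooper substitutes t = 1/(d.x + beta), y = t*x, yielding the LP
        min c.y + alpha*t  s.t.  A y - b t <= 0,  d.y + beta*t = 1,  y, t >= 0,
    solved here by brute-force vertex enumeration in z = (y1, y2, t)."""
    ineqs = [[A[i][0], A[i][1], -b[i]] for i in range(len(A))]
    ineqs += [[-1, 0, 0], [0, -1, 0], [0, 0, -1]]   # y1 >= 0, y2 >= 0, t >= 0
    eq = [d[0], d[1], beta]                          # d.y + beta*t = 1
    best = None
    # A vertex in 3D has 3 active constraints: the equality plus 2 inequalities.
    for i, j in combinations(range(len(ineqs)), 2):
        z = solve3([eq, ineqs[i], ineqs[j]], [1, 0, 0])
        if z is None:
            continue
        if any(sum(g[k] * z[k] for k in range(3)) > 1e-9 for g in ineqs):
            continue                                 # candidate vertex infeasible
        y1, y2, t = z
        if t < 1e-9:
            continue                                 # t = 0 has no finite preimage x
        val = c[0] * y1 + c[1] * y2 + alpha * t
        if best is None or val < best[0]:
            best = (val, (y1 / t, y2 / t))           # recover x = y / t
    return best

# Example: minimise (-2*x1 + x2 + 2) / (x1 + 3*x2 + 4) over
# -x1 + x2 <= 4, 2*x1 + x2 <= 14, x2 <= 6, x >= 0.
val, x = solve_lfp([-2, 1], 2, [1, 3], 4, [[-1, 1], [2, 1], [0, 1]], [4, 14, 6])
# optimum is -12/11 at x = (7, 0)
```

Because the transformation is exact when the denominator stays positive, the LP optimum coincides with the LFP optimum; the fractional program's optimum (like an LP's) is attained at a vertex of the feasible polytope, which is what makes the vertex-enumeration check above valid for this toy instance.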