Protecting Sensitive Location Visits Against Inference Attacks in Trajectory Publishing

Xiangyu Liu, Manish Singh, Xiufeng Xia
DOI: 10.1109/PDCAT46702.2019.00051
Published in: 2019 20th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT), December 2019
Citations: 1

Abstract

With the increasing popularity of Location-Based Services (LBSs), people's trajectories are continuously recorded and collected. Trajectory data are often shared or published to improve user experience, for example through personalized recommendations and activity mining. However, releasing trajectory data makes users' sensitive location visits vulnerable to inference attacks. In this paper, we study the problem of protecting sensitive location visits in the publication of trajectory data, assuming an adversary can mount inference attacks using association rules derived from the data. We propose a methodology for anonymizing trajectories that employs both generalization and suppression, to sanitize the trajectory data and protect sensitive location visits against inference attacks. We design a number of techniques to make our trajectory anonymization algorithm efficient while maintaining utility. We have conducted an empirical study showing that our algorithms can efficiently prevent inference attacks on real datasets while preserving the accuracy of aggregate queries on the published data.
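To make the two sanitization operations concrete, here is a minimal illustrative sketch (not the paper's actual algorithm, which is not reproduced on this page): generalization replaces a sensitive location with a coarser parent region, while suppression drops the visit from the published trajectory entirely. The region hierarchy, location names, and sensitive set below are hypothetical examples.

```python
# Hypothetical location -> parent-region hierarchy used for generalization.
REGION_OF = {
    "oncology_clinic": "hospital_district",
    "cafe_A": "downtown",
    "office_B": "downtown",
}

# Hypothetical set of locations whose visits are considered sensitive.
SENSITIVE = {"oncology_clinic"}

def anonymize(trajectory, suppress=False):
    """Sanitize a trajectory: generalize sensitive visits to their parent
    region, or suppress (remove) them entirely when suppress=True."""
    out = []
    for loc in trajectory:
        if loc in SENSITIVE:
            if suppress:
                continue  # suppression: the visit is omitted from the output
            out.append(REGION_OF.get(loc, loc))  # generalization: coarsen
        else:
            out.append(loc)  # non-sensitive visits are published unchanged
    return out

traj = ["cafe_A", "oncology_clinic", "office_B"]
print(anonymize(traj))                 # ['cafe_A', 'hospital_district', 'office_B']
print(anonymize(traj, suppress=True))  # ['cafe_A', 'office_B']
```

Generalization preserves more utility for aggregate queries (the visit still counts toward its region), while suppression gives stronger protection at a higher utility cost; the paper's contribution lies in choosing between the two so that association rules mined from the published data no longer expose sensitive visits.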