{"title":"Privacy-Preserving WSDM","authors":"A. Korolova","doi":"10.1145/3289600.3291385","DOIUrl":null,"url":null,"abstract":"The goals of learning from user data and preserving user privacy are often considered to be in conflict. This presentation will demonstrate that there are contexts when provable privacy guarantees can be an enabler for better web search and data mining (WSDM), and can empower researchers hoping to change the world by mining sensitive user data. The presentation starts by motivating the rigorous statistical data privacy definition that is particularly suitable for today's world of big data, differential privacy. It will then demonstrate how to achieve differential privacy for WSDM tasks when the data collector is trusted by the users. Using Chrome's deployment of RAPPOR as a case study, it will be shown that achieving differential privacy while preserving utility is feasible even when the data collector is not trusted. The presentation concludes with open problems and challenges for the WSDM community.","PeriodicalId":143253,"journal":{"name":"Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3289600.3291385","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The goals of learning from user data and preserving user privacy are often considered to be in conflict. This presentation will demonstrate that there are contexts in which provable privacy guarantees can be an enabler for better web search and data mining (WSDM), and can empower researchers hoping to change the world by mining sensitive user data. The presentation starts by motivating differential privacy, a rigorous statistical data privacy definition particularly well suited to today's world of big data. It will then demonstrate how to achieve differential privacy for WSDM tasks when the data collector is trusted by the users. Using Chrome's deployment of RAPPOR as a case study, it will be shown that achieving differential privacy while preserving utility is feasible even when the data collector is not trusted. The presentation concludes with open problems and challenges for the WSDM community.
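For readers unfamiliar with the definition referenced above: a randomized mechanism M is ε-differentially private if, for every pair of datasets D and D' differing in one user's data and every set of outputs S,

\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S].

The sketch below illustrates binary randomized response, the basic primitive underlying RAPPOR-style local differential privacy, in which each client randomizes its own report so the (untrusted) collector never sees the true value. This is a minimal illustration only, not RAPPOR's actual implementation (which additionally uses Bloom-filter encodings and permanent plus instantaneous randomization); the function names and parameters here are assumptions for exposition.

import math
import random

def randomized_response(true_bit: int, p: float = 0.75) -> int:
    # Report the true bit with probability p, the flipped bit otherwise.
    # Satisfies eps-local differential privacy with eps = ln(p / (1 - p)).
    return true_bit if random.random() < p else 1 - true_bit

def estimate_frequency(reports: list[int], p: float = 0.75) -> float:
    # Unbiased estimate of the fraction of users whose true bit is 1,
    # correcting for the noise each client added locally.
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Illustrative use: 100,000 simulated users, 30% of whom hold the sensitive bit.
random.seed(0)
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomized_response(b) for b in true_bits]
print(f"epsilon = {math.log(0.75 / 0.25):.3f}")                      # about 1.099
print(f"estimated frequency = {estimate_frequency(reports):.3f}")    # close to 0.30

The example shows the utility/privacy trade-off the talk refers to: each individual report is plausibly deniable, yet aggregate statistics remain estimable with accuracy that improves as the population grows.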