Local Differential Privacy for Person-to-Person Interactions
Authors: Yuichi Sei; Akihiko Ohsuga
Journal: IEEE Open Journal of the Computer Society, vol. 3, pp. 304-312
Published: 2022-12-14 (Journal Article)
DOI: 10.1109/OJCS.2022.3228999
URL: https://ieeexplore.ieee.org/document/9984836/
Open-access PDF: https://ieeexplore.ieee.org/iel7/8782664/9682503/09984836.pdf
Abstract
Many global organizations currently collect personal data for marketing, recommendation-system improvement, and other purposes. Some organizations collect personal data securely using a technique known as $\epsilon$-local differential privacy (LDP). Under LDP, a privacy budget is allocated to each user in advance. Each time a user's data are collected, part of that budget is consumed, and privacy is protected by ensuring that the remaining budget stays greater than or equal to zero. Existing research and organizations assume that each individual's data are completely unrelated to other individuals' data. However, this assumption does not hold when interaction data between users are collected: each interaction record involves two users, so the privacy budget is in fact overspent and each user's privacy is insufficiently protected. This study clarifies this problem of local differential privacy for person-to-person interactions and proposes a mechanism that satisfies LDP in the person-to-person interaction scenario. Mathematical analysis and experimental results show that, compared with existing methods, the proposed mechanism maintains high data utility while ensuring LDP.
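To make the budget-overspending problem concrete, the sketch below uses standard binary randomized response (a textbook $\epsilon$-LDP mechanism, not the mechanism proposed in this paper) together with hypothetical `User` and `report_interaction` helpers. It illustrates the abstract's point: when one user reports an interaction, the partner's privacy is also affected, so charging only the reporter's budget under-counts the total privacy loss.

```python
import math
import random


def randomized_response(bit: bool, epsilon: float) -> bool:
    """Binary randomized response: report the true bit with probability
    e^eps / (e^eps + 1). This classic mechanism satisfies eps-LDP."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else not bit


class User:
    """Tracks one user's remaining privacy budget (allocated in advance,
    as the abstract describes)."""

    def __init__(self, budget: float) -> None:
        self.remaining = budget

    def report(self, bit: bool, epsilon: float) -> bool:
        """Spend `epsilon` of the budget and release a perturbed bit."""
        if self.remaining - epsilon < 0:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= epsilon
        return randomized_response(bit, epsilon)


def report_interaction(reporter: "User", partner: "User",
                       bit: bool, epsilon: float) -> bool:
    """Hypothetical illustration of the paper's problem setting: an
    interaction record describes TWO people, so a single collection
    consumes budget on both sides. A collector that only charged the
    reporter would overspend the partner's budget without noticing."""
    noisy = reporter.report(bit, epsilon)
    partner.remaining -= epsilon  # the partner's privacy is spent too
    return noisy
```

Running `report_interaction(alice, bob, True, 0.5)` with both users starting at budget 1.0 leaves each with 0.5 remaining, whereas a collector unaware of the interaction structure would believe Bob still had his full budget.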