{"title":"基于贝叶斯估计的gnn链路局部差分隐私","authors":"Xiaochen Zhu","doi":"10.1145/3555041.3589398","DOIUrl":null,"url":null,"abstract":"Recent years have witnessed the emergence of graph neural networks (GNNs) and an increasing amount of attention on GNNs from the data management community. Yet, training GNNs may raise privacy concerns as they may reveal sensitive information that must be kept private according to laws. In this paper, we study GNNs with link local differential privacy over decentralized nodes, where an untrusted server collaborates with node clients to train a GNN model without revealing the existence of any link. We find that by spending the privacy budget independently on links and degrees of the graph, the server can use Bayesian estimation to better denoise the graph topology. Unlike existing approaches, our mechanism does not aim to preserve graph density, but allows the server to estimate fewer links under lower privacy budget and higher uncertainty. Hence, the server makes fewer false positive link estimations and trains better models. Finally, we conduct extensive experiments to demonstrate that our method achieves considerably better performance with higher accuracy under same privacy budget compared to existing approaches.","PeriodicalId":161812,"journal":{"name":"Companion of the 2023 International Conference on Management of Data","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Link Local Differential Privacy in GNNs via Bayesian Estimation\",\"authors\":\"Xiaochen Zhu\",\"doi\":\"10.1145/3555041.3589398\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recent years have witnessed the emergence of graph neural networks (GNNs) and an increasing amount of attention on GNNs from the data management community. Yet, training GNNs may raise privacy concerns as they may reveal sensitive information that must be kept private according to laws. In this paper, we study GNNs with link local differential privacy over decentralized nodes, where an untrusted server collaborates with node clients to train a GNN model without revealing the existence of any link. We find that by spending the privacy budget independently on links and degrees of the graph, the server can use Bayesian estimation to better denoise the graph topology. Unlike existing approaches, our mechanism does not aim to preserve graph density, but allows the server to estimate fewer links under lower privacy budget and higher uncertainty. Hence, the server makes fewer false positive link estimations and trains better models. 
Finally, we conduct extensive experiments to demonstrate that our method achieves considerably better performance with higher accuracy under same privacy budget compared to existing approaches.\",\"PeriodicalId\":161812,\"journal\":{\"name\":\"Companion of the 2023 International Conference on Management of Data\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Companion of the 2023 International Conference on Management of Data\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3555041.3589398\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Companion of the 2023 International Conference on Management of Data","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3555041.3589398","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Link Local Differential Privacy in GNNs via Bayesian Estimation
Recent years have witnessed the emergence of graph neural networks (GNNs) and increasing attention to them from the data management community. Yet training GNNs may raise privacy concerns, as it may reveal sensitive information that must be kept private by law. In this paper, we study GNNs with link local differential privacy over decentralized nodes, where an untrusted server collaborates with node clients to train a GNN model without revealing the existence of any link. We find that by spending the privacy budget independently on the links and the degrees of the graph, the server can use Bayesian estimation to better denoise the graph topology. Unlike existing approaches, our mechanism does not aim to preserve graph density; instead, it allows the server to estimate fewer links when the privacy budget is lower and the uncertainty is higher. As a result, the server makes fewer false-positive link estimations and trains better models. Finally, we conduct extensive experiments demonstrating that our method achieves considerably higher accuracy than existing approaches under the same privacy budget.
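To make the high-level idea concrete, below is a minimal sketch of one plausible instantiation of the mechanism described in the abstract: each client perturbs its adjacency bits with randomized response under a link budget and separately reports a Laplace-noised degree under a degree budget, and the server combines the two via a Bayesian posterior to decide which links to keep. The exact perturbation, estimator, and budget split used in the paper may differ; the names `eps_link`, `eps_degree`, and the 0.5 posterior threshold are illustrative assumptions, not the paper's parameters.

```python
# Illustrative sketch only: link LDP via randomized response on adjacency bits,
# a separately privatized degree, and Bayesian denoising on the server side.
import numpy as np


def perturb_adjacency(adj_row, eps_link, rng):
    """Client side: randomized response on each adjacency bit."""
    p_keep = np.exp(eps_link) / (1.0 + np.exp(eps_link))  # prob. of reporting the true bit
    flip = rng.random(adj_row.shape) >= p_keep
    return np.where(flip, 1 - adj_row, adj_row)


def perturb_degree(degree, eps_degree, rng):
    """Client side: report the node degree with Laplace noise (sensitivity 1)."""
    return degree + rng.laplace(scale=1.0 / eps_degree)


def estimate_links(noisy_row, noisy_degree, eps_link, n, threshold=0.5):
    """Server side: posterior probability of each link given the noisy report.

    The prior on a link comes from the independently privatized degree, so
    reported 1-bits that are likely randomized-response noise get discarded.
    """
    p_keep = np.exp(eps_link) / (1.0 + np.exp(eps_link))
    prior = np.clip(noisy_degree / (n - 1), 1e-6, 1 - 1e-6)  # P(link exists)
    # Likelihood of the observed bit under "link" vs. "no link".
    lik_link = np.where(noisy_row == 1, p_keep, 1 - p_keep)
    lik_no_link = np.where(noisy_row == 1, 1 - p_keep, p_keep)
    posterior = prior * lik_link / (prior * lik_link + (1 - prior) * lik_no_link)
    return posterior > threshold  # estimated adjacency row


# Example: one client with 1000 potential neighbors and true degree 10.
rng = np.random.default_rng(0)
n = 1000
adj = np.zeros(n, dtype=int)
adj[rng.choice(n, size=10, replace=False)] = 1

noisy_row = perturb_adjacency(adj, eps_link=1.0, rng=rng)
noisy_deg = perturb_degree(adj.sum(), eps_degree=1.0, rng=rng)
est = estimate_links(noisy_row, noisy_deg, eps_link=1.0, n=n)
print("reported 1-bits:", int(noisy_row.sum()), "estimated links:", int(est.sum()))
```

In this sketch, a tight link budget and a sparse prior keep the posterior of most reported 1-bits below the threshold, so the server conservatively estimates few links, which mirrors the abstract's point that accepting higher uncertainty trades recall for fewer false-positive links.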