PGB: Benchmarking Differentially Private Synthetic Graph Generation Algorithms
Shang Liu, Hao Du, Yang Cao, Bo Yan, Jinfei Liu, Masatoshi Yoshikawa
arXiv - CS - Databases, arXiv:2408.02928, published 2024-08-06
Differentially private graph analysis is a powerful tool for deriving
insights from diverse graph data while protecting individual information.
Designing private analytic algorithms for different graph queries often
requires starting from scratch. In contrast, differentially private synthetic
graph generation offers a general paradigm that supports one-time generation
for multiple queries. Although a rich set of differentially private graph
generation algorithms has been proposed, comparing them effectively remains
challenging due to various factors, including differing privacy definitions,
diverse graph datasets, varied privacy requirements, and multiple utility
metrics. To this end, we propose PGB (Private Graph Benchmark), a comprehensive
benchmark designed to enable researchers to compare differentially private
graph generation algorithms fairly. We begin by identifying four essential
elements of existing works as a 4-tuple: mechanisms, graph datasets, privacy
requirements, and utility metrics. We discuss principles regarding these
elements to ensure the comprehensiveness of a benchmark. Next, we present a
benchmark instantiation that adheres to all principles, establishing a new
method to evaluate existing and newly proposed graph generation algorithms.
Through extensive theoretical and empirical analysis, we gain valuable insights
into the strengths and weaknesses of prior algorithms. Our results indicate
that there is no universal solution for all possible cases. Finally, we provide
guidelines to help researchers select appropriate mechanisms for various
scenarios.
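
The abstract frames each benchmark run as a 4-tuple of mechanism, graph dataset, privacy requirement, and utility metric. A minimal sketch of how such a configuration space could be enumerated is shown below; all names (`BenchmarkConfig`, `enumerate_configs`, the mechanism and metric strings) are illustrative assumptions, not the paper's actual API.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class BenchmarkConfig:
    """One cell of the hypothetical 4-tuple benchmark grid."""
    mechanism: str   # a DP synthetic-graph generation algorithm
    dataset: str     # a graph dataset identifier
    epsilon: float   # privacy budget (the privacy requirement)
    metric: str      # utility metric used for evaluation

def enumerate_configs(mechanisms, datasets, epsilons, metrics):
    """Cartesian product over the four benchmark elements."""
    return [BenchmarkConfig(m, d, e, u)
            for m, d, e, u in product(mechanisms, datasets, epsilons, metrics)]

configs = enumerate_configs(
    mechanisms=["mech_A", "mech_B"],
    datasets=["graph_1"],
    epsilons=[0.5, 1.0],
    metrics=["degree_dist", "clustering_coef"],
)
print(len(configs))  # 2 * 1 * 2 * 2 = 16 combinations -> 8
```

Enumerating the full cross-product is one simple way a benchmark could ensure every algorithm is evaluated under the same datasets, budgets, and metrics, which is the kind of like-for-like comparison the abstract argues is currently missing.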