Improving User Privacy in Practical Quantum Private Query with Group Honesty Checking
Chun-Yan Wei, Qing-Le Wang, Xiao-Qiu Cai, Tian-Yin Wang
Advanced Quantum Technologies 8(3), 2025. DOI: 10.1002/qute.202400429
Abstract
The current cheat-sensitive security level of user privacy in quantum private query (QPQ) is far from meeting its ideal requirement. A dishonest database trying to elicit user privacy can only be detected, with a delay and merely a nonzero probability, after the protocol has finished. Worse yet, no estimation of $p_{succ}$ (i.e., the success probability of the dishonest database's cheating) has been given so far. Such an estimation is quite necessary, because a significant $p_{succ}$ means fragile user privacy, especially in the cheat-sensitive security model. Here, $p_{succ}$ of the first and best-known quantum-key-distribution (QKD)-based QPQ protocol, proposed by Jakobi et al., is estimated, showing that a dishonest database can elicit user privacy with significant probability (e.g., as high as 42.8% for database size $N=10000$), while such cheating can only be detected, after a delay, with probability 50%. The common strategy to improve user privacy, i.e., adding honesty checking to detect a malicious database, may hurt the privacy of the other party, i.e., database security. To solve this problem, a new group honesty checking is proposed, which does not hurt database security and can reduce $p_{succ}$ to a very small value (e.g., 0.26% for database size 10000), thus assuring high user privacy (note that $p_{succ}=0$ means ideal user privacy).
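The figures quoted in the abstract can be put side by side with a short numerical sketch. The snippet below is purely illustrative and not part of the paper's analysis: it takes the reported values for the Jakobi et al. protocol ($p_{succ} \approx 42.8\%$, delayed detection probability 50%) and for the proposed group honesty checking ($p_{succ} \approx 0.26\%$) at database size $N = 10000$, and, under the simplifying assumption that detection is independent of whether the cheating succeeds, compares the chance that a dishonest database both learns the user's query and escapes detection. The detection probability reused for the group-checking case is a placeholder assumption, since the abstract does not state it.

```python
# Illustrative comparison of the cheat-sensitive security figures quoted in the
# abstract (database size N = 10000). The independence assumption below is a
# simplification for illustration only, not taken from the paper.

def undetected_success(p_succ: float, p_detect: float) -> float:
    """Probability that a dishonest database both elicits the user's query
    and escapes the (delayed) honesty check, assuming the two events are
    independent -- a simplifying assumption for this sketch."""
    return p_succ * (1.0 - p_detect)

# Reported figures for Jakobi et al.'s QKD-based QPQ protocol at N = 10000.
jakobi = undetected_success(p_succ=0.428, p_detect=0.50)

# Reported p_succ for the proposed group honesty checking; the 50% detection
# probability is reused here only as a conservative placeholder.
group_check = undetected_success(p_succ=0.0026, p_detect=0.50)

print(f"Jakobi et al.:          {jakobi:.3f}")      # ~0.214
print(f"Group honesty checking: {group_check:.4f}")  # ~0.0013
```

Under these (assumed) conditions, the chance of an undetected, successful attack drops by roughly two orders of magnitude, which is the intuition behind the paper's claim that a small $p_{succ}$ is essential in the cheat-sensitive model.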