Sampling and Certifying Symmetric Functions
Yuval Filmus, Itai Leigh, Artur Riazanov, Dmitry Sokolov
Electron. Colloquium Comput. Complex., 36:1-36:21, 2023-05-07
DOI: 10.48550/arXiv.2305.04363
A circuit $\mathcal{C}$ samples a distribution $\mathbf{X}$ with error $\epsilon$ if the statistical distance between the output of $\mathcal{C}$ on a uniform input and $\mathbf{X}$ is $\epsilon$. We study the hardness of sampling the uniform distribution over the set of $n$-bit strings of Hamming weight $k$, denoted by $\mathbf{U}^n_k$, for _decision forests_, i.e. circuits in which every output bit is computed by a decision tree over the input bits.

For every $k$ there is an $O(\log n)$-depth decision forest sampling $\mathbf{U}^n_k$ with inverse-polynomial error [Viola 2012, Czumaj 2015]. We show that for every $\epsilon>0$ there exists $\tau$ such that for decision depth $\tau \log (n/k) / \log \log (n/k)$, the error for sampling $\mathbf{U}^n_k$ is at least $1-\epsilon$. Our result is based on the recent robust sunflower lemma [Alweiss, Lovett, Wu, Zhang 2021; Rao 2019].

Our second result concerns matching a set of $n$-bit strings with the image of a $d$-_local_ circuit, i.e. a circuit in which each output bit depends on at most $d$ input bits. We study the set of all $n$-bit strings whose Hamming weight is at least $n/2$. We improve the previously known locality lower bound from $\Omega(\log^* n)$ [Beyersdorff, Datta, Krebs, Mahajan, Scharfenberger-Fabian, Sreenivasaiah, Thomas and Vollmer, 2013] to $\Omega(\sqrt{\log n})$, leaving only a quartic gap from the best upper bound of $O(\log^2 n)$.
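To make the sampling-error definition concrete, here is a minimal, self-contained Python sketch (not from the paper; all function names such as `tv_distance` and `output_distribution` are illustrative). It computes the exact output distribution of a toy decision forest on uniform input bits and its statistical (total variation) distance to $\mathbf{U}^n_k$.

```python
from fractions import Fraction
from itertools import product

def tv_distance(p, q):
    """Statistical (total variation) distance between two distributions,
    given as dicts mapping outcomes to probabilities."""
    keys = set(p) | set(q)
    return sum(abs(p.get(x, Fraction(0)) - q.get(x, Fraction(0))) for x in keys) / 2

def output_distribution(forest, m):
    """Exact output distribution of a decision forest on m uniform input bits.
    `forest` is a list of functions, one per output bit, each mapping the
    input tuple to 0/1 (a stand-in for a shallow decision tree)."""
    dist = {}
    for x in product((0, 1), repeat=m):
        y = tuple(tree(x) for tree in forest)
        dist[y] = dist.get(y, Fraction(0)) + Fraction(1, 2 ** m)
    return dist

def uniform_weight_k(n, k):
    """U^n_k: uniform distribution over n-bit strings of Hamming weight k."""
    strings = [y for y in product((0, 1), repeat=n) if sum(y) == k]
    return {y: Fraction(1, len(strings)) for y in strings}

# Toy example with n = 4 output bits and m = 4 input bits: each output bit is a
# depth-1 "tree" that just copies one input bit, so the output is uniform over
# {0,1}^4 rather than over weight-2 strings -- a poor sampler for U^4_2.
n, k, m = 4, 2, 4
forest = [lambda x, i=i: x[i] for i in range(n)]
err = tv_distance(output_distribution(forest, m), uniform_weight_k(n, k))
print(float(err))  # sampling error of this forest with respect to U^4_2 (0.625)
```

The brute-force enumeration is only meant to illustrate the definitions on tiny parameters; the paper's results concern the asymptotic trade-off between decision depth and sampling error.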