{"title":"从非常困难的函数中简单而快速地去随机化:几乎没有成本地消除随机性","authors":"Lijie Chen, R. Tell","doi":"10.1145/3406325.3451059","DOIUrl":null,"url":null,"abstract":"Extending the classical “hardness-to-randomness” line-of-works, Doron, Moshkovitz, Oh, and Zuckerman (FOCS 2020) recently proved that derandomization with near-quadratic time overhead is possible, under the assumption that there exists a function in DTIME[2n] that cannot be computed by randomized SVN circuits of size 2(1−є)· n for a small є. In this work we extend their inquiry and answer several open questions that arose from their work. For a time function T(n), consider the following assumption: Non-uniformly secure one-way functions exist, and for δ=δ(є) and k=kT(є) there exists a problem in DTIME[2k· n] that is hard for algorithms that run in time 2(k−δ)· n and use 2(1−δ)· n bits of advice. Under this assumption, we show that: 1. (Worst-case derandomization.) Probabilistic algorithms that run in time T(n) can be deterministically simulated in time n· T(n)1+є. 2. (Average-case derandomization.) For polynomial time functions T(n)=poly(n), we can improve the derandomization time to nє· T(n) if we allow the derandomization to succeed only on average, rather than in the worst-case. 3. (Conditional optimality.) For worst-case derandomization, the multiplicative time overhead of n is essentially optimal, conditioned on a counting version of the non-deterministic strong exponential-time hypothesis (i.e., on #NSETH). Lastly, we present an alternative proof for the result of Doron, Moshkovitz, Oh, and Zuckerman that is simpler and more versatile. In fact, we show how to simplify the analysis not only of their construction, but of any construction that “extracts randomness from a pseudoentropic string”.","PeriodicalId":132752,"journal":{"name":"Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing","volume":"106 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":"{\"title\":\"Simple and fast derandomization from very hard functions: eliminating randomness at almost no cost\",\"authors\":\"Lijie Chen, R. Tell\",\"doi\":\"10.1145/3406325.3451059\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Extending the classical “hardness-to-randomness” line-of-works, Doron, Moshkovitz, Oh, and Zuckerman (FOCS 2020) recently proved that derandomization with near-quadratic time overhead is possible, under the assumption that there exists a function in DTIME[2n] that cannot be computed by randomized SVN circuits of size 2(1−є)· n for a small є. In this work we extend their inquiry and answer several open questions that arose from their work. For a time function T(n), consider the following assumption: Non-uniformly secure one-way functions exist, and for δ=δ(є) and k=kT(є) there exists a problem in DTIME[2k· n] that is hard for algorithms that run in time 2(k−δ)· n and use 2(1−δ)· n bits of advice. Under this assumption, we show that: 1. (Worst-case derandomization.) Probabilistic algorithms that run in time T(n) can be deterministically simulated in time n· T(n)1+є. 2. (Average-case derandomization.) For polynomial time functions T(n)=poly(n), we can improve the derandomization time to nє· T(n) if we allow the derandomization to succeed only on average, rather than in the worst-case. 3. (Conditional optimality.) 
For worst-case derandomization, the multiplicative time overhead of n is essentially optimal, conditioned on a counting version of the non-deterministic strong exponential-time hypothesis (i.e., on #NSETH). Lastly, we present an alternative proof for the result of Doron, Moshkovitz, Oh, and Zuckerman that is simpler and more versatile. In fact, we show how to simplify the analysis not only of their construction, but of any construction that “extracts randomness from a pseudoentropic string”.\",\"PeriodicalId\":132752,\"journal\":{\"name\":\"Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing\",\"volume\":\"106 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"16\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3406325.3451059\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3406325.3451059","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Extending the classical “hardness-to-randomness” line of work, Doron, Moshkovitz, Oh, and Zuckerman (FOCS 2020) recently proved that derandomization with near-quadratic time overhead is possible, under the assumption that there exists a function in DTIME[2^n] that cannot be computed by randomized SVN circuits of size 2^{(1−ε)·n} for a small ε. In this work we extend their inquiry and answer several open questions that arose from their work.

For a time function T(n), consider the following assumption: non-uniformly secure one-way functions exist, and for δ = δ(ε) and k = k_T(ε) there exists a problem in DTIME[2^{k·n}] that is hard for algorithms that run in time 2^{(k−δ)·n} and use 2^{(1−δ)·n} bits of advice. Under this assumption, we show that:

1. (Worst-case derandomization.) Probabilistic algorithms that run in time T(n) can be deterministically simulated in time n·T(n)^{1+ε}.
2. (Average-case derandomization.) For polynomial time functions T(n) = poly(n), we can improve the derandomization time to n^ε·T(n) if we allow the derandomization to succeed only on average, rather than in the worst case.
3. (Conditional optimality.) For worst-case derandomization, the multiplicative time overhead of n is essentially optimal, conditioned on a counting version of the non-deterministic strong exponential-time hypothesis (i.e., on #NSETH).

Lastly, we present an alternative proof for the result of Doron, Moshkovitz, Oh, and Zuckerman that is simpler and more versatile. In fact, we show how to simplify the analysis not only of their construction, but of any construction that “extracts randomness from a pseudoentropic string”.
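For readers who prefer symbols, the following LaTeX snippet restates the hardness assumption and the two derandomization results above. It is only a paraphrase of the abstract: the label H_T(ε) and the class-style shorthand (BPTIME, "avg-DTIME") are our own notation, not the paper's formal theorem statements.

```latex
% Minimal, compilable sketch (pdflatex). The label H_T(eps) and the class
% notation below are our shorthand for the abstract's prose, not the
% paper's formal definitions.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

\noindent\textbf{Assumption} $\mathsf{H}_T(\varepsilon)$:
non-uniformly secure one-way functions exist, and for
$\delta=\delta(\varepsilon)$ and $k=k_T(\varepsilon)$ there is a problem
$L\in\mathrm{DTIME}\!\left[2^{k\cdot n}\right]$ that cannot be decided by
algorithms running in time $2^{(k-\delta)\cdot n}$ with
$2^{(1-\delta)\cdot n}$ bits of advice.

\medskip
\noindent\textbf{Consequences} (informal class notation):
\begin{align*}
\mathrm{BPTIME}[T(n)] &\subseteq \mathrm{DTIME}\!\left[n\cdot T(n)^{1+\varepsilon}\right]
  && \text{(worst-case simulation)}\\
\mathrm{BPTIME}[T(n)] &\subseteq \text{avg-}\mathrm{DTIME}\!\left[n^{\varepsilon}\cdot T(n)\right]
  && \text{(average case, } T(n)=\mathrm{poly}(n)\text{)}
\end{align*}

\noindent For example, for $T(n)=n^{2}$ the worst-case simulation runs in time
roughly $n^{3+2\varepsilon}$, i.e., with only a nearly-linear multiplicative
overhead beyond the probabilistic running time.

\end{document}
```

The worked instance at the end illustrates why the overhead is called "almost no cost": the simulation pays roughly a factor of n beyond T(n), and item 3 says that, under #NSETH, this factor cannot be removed in the worst case.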