{"title":"Near-optimal conversion of hardness into pseudo-randomness","authors":"R. Impagliazzo, Ronen Shaltiel, A. Wigderson","doi":"10.1109/SFFCS.1999.814590","DOIUrl":null,"url":null,"abstract":"Various efforts have been made to derandomize probabilistic algorithms using the assumption that there exists a problem in E=dtime(2/sup O(n)/) that requires circuits of size s(n) (for some function s). These results are based on the NW (Nisan & Wigderson, 1997) generator. For the strong lower bound s(n)=2/sup /spl epsi/n/, the optimal derandomization is P=BPP. However, for weaker lower bound functions s(n), these constructions fall short of the natural conjecture for optimal derandomization that bptime(t)/spl sube/ dtime(2¿O[s/sup -1/(t)]). The gap is due to an inherent efficiency limitation in NW-style pseudorandom generators. We are able to obtain derandomization in almost optimal time using any lower bound s(n). We do this by using the NW-generator in a more sophisticated way. We view any failure of the generator as a reduction from the given hard function to its restrictions on smaller input sizes. Thus, either the original construction works optimally or one of the restricted functions is as hard as the original. Any such restriction can then be plugged into the NW-generator recursively. This process generates many candidate generators, and at least one is guaranteed to be good. To perform the approximation of the acceptance probability of the given circuit, we run a tournament between the candidate generators which yields an accurate estimate. We explore information theoretic analogs of our new construction. The inherent limitation of the NW-generator makes the extra randomness required by that extractor suboptimal. However, applying our construction, we get an almost optimal disperser.","PeriodicalId":385047,"journal":{"name":"40th Annual Symposium on Foundations of Computer Science (Cat. 
No.99CB37039)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"50","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"40th Annual Symposium on Foundations of Computer Science (Cat. No.99CB37039)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SFFCS.1999.814590","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 50
Abstract
Various efforts have been made to derandomize probabilistic algorithms under the assumption that there exists a problem in E = DTIME(2^{O(n)}) that requires circuits of size s(n) (for some function s). These results are based on the NW generator (Nisan & Wigderson, 1997). For the strong lower bound s(n) = 2^{εn}, the optimal derandomization P = BPP is obtained. However, for weaker lower-bound functions s(n), these constructions fall short of the natural conjecture for optimal derandomization, namely BPTIME(t) ⊆ DTIME(2^{O(s^{-1}(t))}). The gap is due to an inherent efficiency limitation of NW-style pseudorandom generators.

We obtain derandomization in almost optimal time using any lower bound s(n). We do this by using the NW generator in a more sophisticated way: we view any failure of the generator as a reduction from the given hard function to one of its restrictions on smaller input sizes. Thus, either the original construction works optimally, or one of the restricted functions is as hard as the original; any such restriction can then be plugged into the NW generator recursively. This process generates many candidate generators, at least one of which is guaranteed to be good. To approximate the acceptance probability of the given circuit, we run a tournament between the candidate generators, which yields an accurate estimate.

We also explore information-theoretic analogs of our new construction. The inherent limitation of the NW generator makes the extra randomness required by the corresponding extractor suboptimal; however, applying our construction, we obtain an almost optimal disperser.
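For background on the estimation step the abstract refers to: given any single pseudorandom generator, a circuit's acceptance probability is approximated deterministically by enumerating every seed and averaging the circuit's answers on the generator's outputs. A minimal sketch of that step (the `gen` and `parity` functions below are toy stand-ins for illustration only, not the paper's construction):

```python
from itertools import product

def estimate_acceptance(circuit, generator, seed_len):
    """Deterministically estimate Pr_x[circuit(x) = 1] by averaging the
    circuit over the generator's outputs on all 2^seed_len seeds."""
    seeds = list(product([0, 1], repeat=seed_len))
    return sum(circuit(generator(s)) for s in seeds) / len(seeds)

# Toy stand-ins (hypothetical): a "generator" that doubles its seed to
# length 4, and a "circuit" accepting inputs of odd parity.
gen = lambda s: s + s           # 2-bit seed -> 4-bit output
parity = lambda x: sum(x) % 2   # accepts odd-weight inputs

print(estimate_acceptance(parity, gen, 2))  # -> 0.0
```

Note that this toy generator fails badly here: a doubled seed always has even parity, so the estimate 0.0 is far from the true acceptance probability 1/2 of `parity` on uniform 4-bit inputs. This is exactly the kind of failure the paper's construction turns to its advantage, by reinterpreting it as hardness of a restricted function.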