{"title":"去随机化BPP:最先进的技术","authors":"A. Wigderson","doi":"10.1109/CCC.1999.766263","DOIUrl":null,"url":null,"abstract":"The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. Even for function computation, many examples are known in which randomization allows considerable savings in resources like space and time over deterministic algorithms, or even \"only\" simplifies them. But to what extent is this seeming power of randomness over determinism real? The most famous concrete version of this question regards the power of BPP, the class of problems solvable by probabilistic polynomial time algorithms making small constant error. What is the relative power of such algorithms compared to deterministic ones? This is largely open. On the one hand, it is possible that P=BPP, i.e., randomness is useless for solving new problems in polynomial-time. On the other, we might have BPP=EXP, which would say that randomness would be a nearly omnipotent tool for algorithm design. The only viable path towards resolving this problem was the concept of \"pseudorandom generators\", and the \"hardness vs. randomness\" paradigm: BPP can be nontrivially simulated by deterministic algorithms, if some hard function is available. While the hard functions above needed in fact to be one-way functions, completely different pseudo-random generators allowed the use of any hard function in EXP for such nontrivial simulation. Further progress considerably weakened the hardness requirement, and considerably strengthened the deterministic simulation.","PeriodicalId":432015,"journal":{"name":"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)","volume":"164 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"De-randomizing BPP: the state of the art\",\"authors\":\"A. Wigderson\",\"doi\":\"10.1109/CCC.1999.766263\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. Even for function computation, many examples are known in which randomization allows considerable savings in resources like space and time over deterministic algorithms, or even \\\"only\\\" simplifies them. But to what extent is this seeming power of randomness over determinism real? The most famous concrete version of this question regards the power of BPP, the class of problems solvable by probabilistic polynomial time algorithms making small constant error. What is the relative power of such algorithms compared to deterministic ones? This is largely open. On the one hand, it is possible that P=BPP, i.e., randomness is useless for solving new problems in polynomial-time. On the other, we might have BPP=EXP, which would say that randomness would be a nearly omnipotent tool for algorithm design. The only viable path towards resolving this problem was the concept of \\\"pseudorandom generators\\\", and the \\\"hardness vs. 
randomness\\\" paradigm: BPP can be nontrivially simulated by deterministic algorithms, if some hard function is available. While the hard functions above needed in fact to be one-way functions, completely different pseudo-random generators allowed the use of any hard function in EXP for such nontrivial simulation. Further progress considerably weakened the hardness requirement, and considerably strengthened the deterministic simulation.\",\"PeriodicalId\":432015,\"journal\":{\"name\":\"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)\",\"volume\":\"164 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1999-05-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCC.1999.766263\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCC.1999.766263","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that cannot be performed deterministically. Even for function computation, many examples are known in which randomization allows considerable savings in resources such as space and time over deterministic algorithms, or "only" simplifies them. But to what extent is this apparent power of randomness over determinism real? The most famous concrete version of this question concerns the power of BPP, the class of problems solvable by probabilistic polynomial-time algorithms with small constant error probability. What is the relative power of such algorithms compared to deterministic ones? This is largely open. On the one hand, it is possible that P = BPP, i.e., that randomness is useless for solving new problems in polynomial time. On the other hand, we might have BPP = EXP, which would make randomness a nearly omnipotent tool for algorithm design. The only viable path towards resolving this problem has been the concept of "pseudorandom generators" and the "hardness vs. randomness" paradigm: BPP can be nontrivially simulated by deterministic algorithms if some hard function is available. While the hard functions originally needed to be one-way functions, completely different pseudorandom generators later allowed the use of any hard function in EXP for such nontrivial simulation. Further progress considerably weakened the hardness requirement and considerably strengthened the deterministic simulation.
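Not part of the original abstract: the following is a minimal Python sketch of the seed-enumeration idea behind such deterministic simulations. A pseudorandom generator stretches a short seed into many "pseudorandom" coins; enumerating all seeds and taking a majority vote removes the randomness at a cost of 2^(seed length) runs, which is polynomial when the seed has O(log n) bits. The names toy_prg, derandomize, and randomized_odd_test are illustrative placeholders; a real hardness-based generator (e.g., the Nisan-Wigderson construction) builds its output from a hard function and is far more involved.

    import itertools

    def toy_prg(seed_bits, output_len):
        # Hypothetical placeholder generator: repeats the seed to the desired
        # length. It carries no hardness guarantee and only illustrates the
        # interface "short seed in, long bit string out".
        bits = list(seed_bits)
        return [bits[i % len(bits)] for i in range(output_len)]

    def derandomize(bpp_algorithm, x, seed_len, coins_needed):
        # Deterministically simulate a two-sided-error randomized algorithm
        # by running it on the generator's output for every possible seed
        # and accepting iff a majority of the runs accept.
        votes = 0
        total = 0
        for seed in itertools.product([0, 1], repeat=seed_len):
            coins = toy_prg(seed, coins_needed)
            votes += 1 if bpp_algorithm(x, coins) else 0
            total += 1
        return votes * 2 > total

    def randomized_odd_test(x, coins):
        # Stand-in for an arbitrary BPP algorithm A(x, r); this one happens
        # to ignore its coins and simply tests whether x is odd.
        return x % 2 == 1

    print(derandomize(randomized_odd_test, 7, seed_len=4, coins_needed=16))  # True

The simulation costs 2^seed_len evaluations of the randomized algorithm, so the whole point of the hardness vs. randomness paradigm is to construct, from a suitably hard function, a generator whose seed length is logarithmic while its output still "fools" every polynomial-time test.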