{"title":"亚原子尺度上的量子计算及其对摩尔定律未来的影响","authors":"N. Lori, J. Neves, A. Blin, Victor Alves","doi":"10.26421/QIC20.1-2-1","DOIUrl":null,"url":null,"abstract":"The contemporary development of Quantum Computers has opened new possibilities for computation improvements, but the limits of Moore’s law validity are starting to show. We analyze here the possibility that miniaturization will continue to be the source of Moore’s law validity in the near future, and our conclusion is that miniaturization is no longer a reliable answer for the future development of computer science, but instead we suggest that lateralization is the correct approach. By lateralization, we mean the use of biology as the correct format for the implementation of ubiquitous computerized systems, a format that might in many circumstances eschew miniaturization as an overly expensive useless advantage whereas in other cases miniaturization might play a key role. Thus, the future of computer science is not towards a miniaturization that goes from the atom-scale (its present application scale) towards the nucleus-scale, but rather in developing more integrated circuits at the micrometer to nanometer scale, so as to better mimic and interact with biological systems. We analyze some ”almost sci-fi” approaches to the development of better computer systems near the Bekenstein bound limit, and unsurprisingly they fail to have any realistic feasibility. Then, we use the difference between the classical vs. quantum version of the Hammerstein-Clifford theorem to explain why biological systems eschewed quantum computation to represent the world but have chosen classical computation instead. Finally, we analyze examples of recent work which indicate future possibilities of integration between computers and biological systems. As a corollary of that choice by the biological systems, we propose that the predicted lateralization-driven evolution in computer science will not be based in quantum computers, but rather in classical computers.","PeriodicalId":20904,"journal":{"name":"Quantum Inf. Comput.","volume":"46 1","pages":"1-13"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Some considerations on quantum computing at sub-atomic scales and its impact in the future of Moore's law\",\"authors\":\"N. Lori, J. Neves, A. Blin, Victor Alves\",\"doi\":\"10.26421/QIC20.1-2-1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The contemporary development of Quantum Computers has opened new possibilities for computation improvements, but the limits of Moore’s law validity are starting to show. We analyze here the possibility that miniaturization will continue to be the source of Moore’s law validity in the near future, and our conclusion is that miniaturization is no longer a reliable answer for the future development of computer science, but instead we suggest that lateralization is the correct approach. By lateralization, we mean the use of biology as the correct format for the implementation of ubiquitous computerized systems, a format that might in many circumstances eschew miniaturization as an overly expensive useless advantage whereas in other cases miniaturization might play a key role. 
Thus, the future of computer science is not towards a miniaturization that goes from the atom-scale (its present application scale) towards the nucleus-scale, but rather in developing more integrated circuits at the micrometer to nanometer scale, so as to better mimic and interact with biological systems. We analyze some ”almost sci-fi” approaches to the development of better computer systems near the Bekenstein bound limit, and unsurprisingly they fail to have any realistic feasibility. Then, we use the difference between the classical vs. quantum version of the Hammerstein-Clifford theorem to explain why biological systems eschewed quantum computation to represent the world but have chosen classical computation instead. Finally, we analyze examples of recent work which indicate future possibilities of integration between computers and biological systems. As a corollary of that choice by the biological systems, we propose that the predicted lateralization-driven evolution in computer science will not be based in quantum computers, but rather in classical computers.\",\"PeriodicalId\":20904,\"journal\":{\"name\":\"Quantum Inf. Comput.\",\"volume\":\"46 1\",\"pages\":\"1-13\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Quantum Inf. Comput.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.26421/QIC20.1-2-1\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Quantum Inf. Comput.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.26421/QIC20.1-2-1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Some considerations on quantum computing at sub-atomic scales and its impact in the future of Moore's law
The contemporary development of quantum computers has opened new possibilities for improving computation, but the limits of the validity of Moore's law are starting to show. We analyze here the possibility that miniaturization will continue to sustain Moore's law in the near future, and we conclude that miniaturization is no longer a reliable answer for the future development of computer science; instead, we suggest that lateralization is the correct approach. By lateralization, we mean the use of biology as the correct format for implementing ubiquitous computerized systems, a format that in many circumstances might eschew miniaturization as an overly expensive and useless advantage, whereas in other cases miniaturization might play a key role. Thus, the future of computer science does not lie in a miniaturization that proceeds from the atomic scale (its present application scale) towards the nuclear scale, but rather in developing more integrated circuits at the micrometer-to-nanometer scale, so as to better mimic and interact with biological systems. We analyze some "almost sci-fi" approaches to the development of better computer systems near the Bekenstein bound, and, unsurprisingly, they turn out to lack any realistic feasibility. Then, we use the difference between the classical and quantum versions of the Hammersley-Clifford theorem to explain why biological systems eschewed quantum computation for representing the world and chose classical computation instead. Finally, we analyze examples of recent work that indicate future possibilities of integration between computers and biological systems. As a corollary of that choice by biological systems, we propose that the predicted lateralization-driven evolution of computer science will not be based on quantum computers, but rather on classical computers.
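For reference, a minimal statement of the Bekenstein bound invoked above (the standard textbook form, not a formula taken from the paper itself; R and E denote the radius of a sphere enclosing the system and its total energy): the entropy S, and hence the information content I, of any physical system satisfies

  S \le \frac{2 \pi k_B R E}{\hbar c}, \qquad I \le \frac{2 \pi R E}{\hbar c \ln 2} \ \text{bits},

where k_B is the Boltzmann constant, \hbar the reduced Planck constant, and c the speed of light. Any computer confined to a given size and energy budget therefore has a hard upper limit on the information it can store, which is the sense in which the "almost sci-fi" proposals discussed in the paper operate near a fundamental physical limit.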