Research on Quantum Computing Technology and Application

Mengliang Li, Hong Yang, Xiong Guo
{"title":"Research on Quantum Computing Technology and Application","authors":"Mengliang Li, Hong Yang, Xiong Guo","doi":"10.2991/MASTA-19.2019.30","DOIUrl":null,"url":null,"abstract":"Quantum computing is a novelty type of calculation mode that follows the rules of quantum mechanics regulating quantum information units. The general theoretical model of a quantum computer is a universal Turing machine that is reinterpreted by using the laws of quantum mechanics. Quantum computing can greatly improve the computational efficiency through the principle of quantum superposition, which is widely used in artificial intelligence (AI), accurate weather forecast, traffic congestion management and other fields. In the past 10 years, quantum computing has been increasing in technology, the number of products and the scale of industry. This paper will introduce the history, concepts, current research status, main technology and applications of quantum computing, and the development trends of quantum computing will be presented. Introduction For most of computing history, the foundational hardware technology has been binary digital transistor logic. In such digital systems, data and programs are encoded into binary digits (bits) based on two states: on and off. The field of quantum computing introduces a whole new approach to the underlying computing hardware by shifting from simple binary (two-state) logic to a more powerful multi-state logic using a new notion of bit, known as “quantum bits” or “qubits” which are represented by quantum as superposition and entanglement[1]. This shift from a binary digital representation found in today’s conventional computers to a quantum digital representation in tomorrow’s computers will bring huge increases in computing power and new, innovative software that handles today’s hugely complex distributed computational problems and provides more powerful analysis of today’s complex data patterns [2]. Quantum computing holds the potential to revolutionize fields from chemistry and logistics to finance and physics. However, the increase in power and capability that quantum computing will provide, will also be seen as a dire threat because it can easily defeat today’s encryption mechanisms, which have all been built using pre-quantum computing approaches. As strong as today’s encryption mechanisms have been, they wouldn’t stand a chance against a quantum computing-based attack. This widely known risk associated with the power of quantum computing is very concerning for governments, institutions and individuals whose encrypted files are safe today, but may not be in 10-20 years when quantum computing takes off [8]. This paper will provide the history of quantum computing and its concepts. it will summarize the current research status, main technology and applications of quantum computing, as well as summarize the development trend associated with quantum computing. Basic Concepts of Quantum Computing Definition Quantum computing is a fresh type of calculation mode that follows the rules of quantum mechanics regulating quantum information units. Quantum computing is a kind of parallel computing, which takes entangled quantum states as the carrier of information transmission, and uses the linear superposition principle of quantum states to complete the parallel computing and ultimately get the required information. 
Quantum computers have very high parallel computing power, which can be International Conference on Modeling, Analysis, Simulation Technologies and Applications (MASTA 2019) Copyright © 2019, the Authors. Published by Atlantis Press. This is an open access article under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/). Advances in Intelligent Systems Research, volume 168","PeriodicalId":103896,"journal":{"name":"Proceedings of the 2019 International Conference on Modeling, Analysis, Simulation Technologies and Applications (MASTA 2019)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2019 International Conference on Modeling, Analysis, Simulation Technologies and Applications (MASTA 2019)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2991/MASTA-19.2019.30","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Quantum computing is a novel model of computation that follows the rules of quantum mechanics to manipulate quantum information units. The general theoretical model of a quantum computer is a universal Turing machine reinterpreted through the laws of quantum mechanics. By exploiting the principle of quantum superposition, quantum computing can greatly improve computational efficiency, with prospective applications in artificial intelligence (AI), accurate weather forecasting, traffic congestion management, and other fields. Over the past ten years, quantum computing has grown steadily in technological maturity, number of products, and industrial scale. This paper introduces the history, concepts, current research status, main technologies, and applications of quantum computing, and presents its development trends.

Introduction

For most of computing history, the foundational hardware technology has been binary digital transistor logic. In such digital systems, data and programs are encoded into binary digits (bits) with two states: on and off. Quantum computing introduces a wholly new approach to the underlying hardware by shifting from binary (two-state) logic to a more powerful model built on a new notion of the bit, the "quantum bit" or "qubit", which exploits quantum-mechanical phenomena such as superposition and entanglement [1]. This shift from the binary digital representation of today's conventional computers to the quantum representation of tomorrow's computers promises large increases in computing power, together with new, innovative software that can handle today's hugely complex distributed computational problems and provide more powerful analysis of complex data patterns [2].

Quantum computing holds the potential to revolutionize fields from chemistry and logistics to finance and physics. However, the increase in power and capability that quantum computing will provide is also seen as a dire threat, because it can defeat today's encryption mechanisms, all of which were built using pre-quantum approaches: public-key schemes such as RSA rest on the hardness of integer factorization, a problem Shor's algorithm solves in polynomial time on a quantum computer. As strong as today's encryption mechanisms have been, they would not stand a chance against a quantum-computing-based attack. This widely known risk is of great concern to governments, institutions, and individuals whose encrypted files are safe today but may not be in 10-20 years, when quantum computing takes off [8].

This paper reviews the history and concepts of quantum computing, summarizes its current research status, main technologies, and applications, and outlines the development trends associated with quantum computing.

Basic Concepts of Quantum Computing

Definition. Quantum computing is a new model of computation that follows the rules of quantum mechanics to manipulate quantum information units. It is a form of parallel computing: entangled quantum states serve as the carrier of information transmission, and the linear superposition principle of quantum states is used to carry out the computation in parallel and ultimately obtain the required information. Quantum computers have very high parallel computing power, which can be […]
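The superposition and parallelism described above can be made concrete with a small simulation. The following is a minimal sketch, not taken from the paper, assuming Python with numpy: it builds a single-qubit superposition with a Hadamard gate, forms an n-qubit register whose state vector carries 2^n amplitudes (the source of the parallelism the authors describe), and samples one measurement outcome under the Born rule.

```python
import numpy as np

# Computational basis state |0> as a column vector.
ket0 = np.array([1.0, 0.0])

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# A single qubit in superposition: amplitudes alpha, beta with
# |alpha|^2 + |beta|^2 = 1.
psi = H @ ket0
print(psi)  # [0.70710678 0.70710678]

# n qubits: the joint state is the tensor (Kronecker) product, so the
# state vector has 2**n amplitudes evolving simultaneously.
n = 3
state = psi
for _ in range(n - 1):
    state = np.kron(state, psi)
print(len(state))  # 2**3 = 8 equal amplitudes

# Measurement collapses the state: outcome k is observed with
# probability |amplitude_k|^2 (the Born rule).
probs = np.abs(state) ** 2
outcome = np.random.choice(len(state), p=probs)
print(format(outcome, f"0{n}b"))  # one random 3-bit string per run
```

Note that a measurement yields only one n-bit outcome per run; practical quantum speedups come from interference among the 2^n amplitudes, not from reading them all out at once, which is worth keeping in mind when the paper speaks of "parallel computing power".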