{"title":"An Efficient QAOA via a Polynomial QPU-Needless Approach","authors":"F. Chicano, Z. Dahi, Gabriel Luque","doi":"10.1145/3583133.3596409","DOIUrl":null,"url":null,"abstract":"The Quantum Approximate Optimization Algorithm (QAOA) is a hybrid quantum algorithm described as ansatzes that represent both the problem and the mixer Hamiltonians. Both are parameterizable unitary transformations executed on a quantum machine/simulator and whose parameters are iteratively optimized using a classical device to optimize the problem's expectation value. To do so, in each QAOA iteration, most of the literature uses a quantum machine/simulator to measure the QAOA outcomes. However, this poses a severe bottleneck considering that quantum machines are hardly constrained (e.g. long queuing, limited qubits, etc.), likewise, quantum simulation also induces exponentially-increasing memory usage when dealing with large problems requiring more qubits. These limitations make today's QAOA implementation impractical since it is hard to obtain good solutions with a reasonably-acceptable time/resources. Considering these facts, this work presents a new approach with two main contributions, including (I) removing the need for accessing quantum devices or large-sized classical machines during the QAOA optimization phase, and (II) ensuring that when dealing with some k-bounded pseudo-Boolean problems, optimizing the exact problem's expectation value can be done in polynomial time using a classical computer.","PeriodicalId":422029,"journal":{"name":"Proceedings of the Companion Conference on Genetic and Evolutionary Computation","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Companion Conference on Genetic and Evolutionary Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3583133.3596409","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The Quantum Approximate Optimization Algorithm (QAOA) is a hybrid quantum algorithm whose ansatz is built from two Hamiltonians, one representing the problem and one acting as a mixer. Both give rise to parameterized unitary transformations executed on a quantum machine or simulator, and their parameters are iteratively tuned by a classical device to optimize the problem's expectation value. To do so, most of the literature uses a quantum machine or simulator in each QAOA iteration to measure the QAOA outcomes. However, this poses a severe bottleneck: quantum machines are highly constrained (e.g. long queuing times, limited numbers of qubits), and, likewise, quantum simulation incurs memory usage that grows exponentially with the number of qubits required by larger problems. These limitations make today's QAOA implementations impractical, since it is hard to obtain good solutions within a reasonable time and resource budget. In view of these facts, this work presents a new approach with two main contributions: (I) it removes the need to access quantum devices or large classical machines during the QAOA optimization phase, and (II) it guarantees that, for some k-bounded pseudo-Boolean problems, the exact expectation value of the problem can be optimized in polynomial time on a classical computer.
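For context, the quantity the classical optimizer tunes is the expectation value of the problem Hamiltonian under the QAOA ansatz. The following is a minimal sketch in standard textbook notation (problem Hamiltonian H_P, mixer H_M, depth p, angles gamma and beta); this notation is assumed here and is not reproduced from the paper itself.

% Standard depth-p QAOA state and expectation value (textbook notation, assumed):
\[
  |\gamma,\beta\rangle \;=\; e^{-i\beta_p H_M}\, e^{-i\gamma_p H_P} \cdots e^{-i\beta_1 H_M}\, e^{-i\gamma_1 H_P}\, |+\rangle^{\otimes n},
\]
\[
  F(\gamma,\beta) \;=\; \langle \gamma,\beta |\, H_P \,| \gamma,\beta \rangle ,
\]

where the classical optimizer searches for angles \((\gamma^*,\beta^*)\) that extremize \(F\). In this framing, contributions (I) and (II) of the paper concern evaluating and optimizing \(F\) classically, in polynomial time for some k-bounded pseudo-Boolean objectives, rather than estimating it from measurements on a quantum device or a full statevector simulation.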