Parameter Transfer for Quantum Approximate Optimization of Weighted MaxCut

Ruslan Shaydulin, Phillip C. Lotshaw, Jeffrey Larson, James Ostrowski, T. Humble
{"title":"加权MaxCut量子近似优化的参数传递","authors":"Ruslan Shaydulin, Phillip C. Lotshaw, Jeffrey Larson, James Ostrowski, T. Humble","doi":"10.1145/3584706","DOIUrl":null,"url":null,"abstract":"Finding high-quality parameters is a central obstacle to using the quantum approximate optimization algorithm (QAOA). Previous work partially addresses this issue for QAOA on unweighted MaxCut problems by leveraging similarities in the objective landscape among different problem instances. However, we show that the more general weighted MaxCut problem has significantly modified objective landscapes, with a proliferation of poor local optima. Our main contribution is a simple rescaling scheme that overcomes these deleterious effects of weights. We show that for a given QAOA depth, a single “typical” vector of QAOA parameters can be successfully transferred to weighted MaxCut instances. This transfer leads to a median decrease in the approximation ratio of only 2.0 percentage points relative to a considerably more expensive direct optimization on a dataset of 34,701 instances with up to 20 nodes and multiple weight distributions. This decrease can be reduced to 1.2 percentage points at the cost of only 10 additional QAOA circuit evaluations with parameters sampled from a pretrained metadistribution, or the transferred parameters can be used as a starting point for a single local optimization run to obtain approximation ratios equivalent to those achieved by exhaustive optimization in 96.35% of our cases.","PeriodicalId":365166,"journal":{"name":"ACM Transactions on Quantum Computing","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"40","resultStr":"{\"title\":\"Parameter Transfer for Quantum Approximate Optimization of Weighted MaxCut\",\"authors\":\"Ruslan Shaydulin, Phillip C. Lotshaw, Jeffrey Larson, James Ostrowski, T. Humble\",\"doi\":\"10.1145/3584706\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Finding high-quality parameters is a central obstacle to using the quantum approximate optimization algorithm (QAOA). Previous work partially addresses this issue for QAOA on unweighted MaxCut problems by leveraging similarities in the objective landscape among different problem instances. However, we show that the more general weighted MaxCut problem has significantly modified objective landscapes, with a proliferation of poor local optima. Our main contribution is a simple rescaling scheme that overcomes these deleterious effects of weights. We show that for a given QAOA depth, a single “typical” vector of QAOA parameters can be successfully transferred to weighted MaxCut instances. This transfer leads to a median decrease in the approximation ratio of only 2.0 percentage points relative to a considerably more expensive direct optimization on a dataset of 34,701 instances with up to 20 nodes and multiple weight distributions. 
This decrease can be reduced to 1.2 percentage points at the cost of only 10 additional QAOA circuit evaluations with parameters sampled from a pretrained metadistribution, or the transferred parameters can be used as a starting point for a single local optimization run to obtain approximation ratios equivalent to those achieved by exhaustive optimization in 96.35% of our cases.\",\"PeriodicalId\":365166,\"journal\":{\"name\":\"ACM Transactions on Quantum Computing\",\"volume\":\"27 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"40\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Quantum Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3584706\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Quantum Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3584706","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 40

Abstract

Finding high-quality parameters is a central obstacle to using the quantum approximate optimization algorithm (QAOA). Previous work partially addresses this issue for QAOA on unweighted MaxCut problems by leveraging similarities in the objective landscape among different problem instances. However, we show that the more general weighted MaxCut problem has significantly modified objective landscapes, with a proliferation of poor local optima. Our main contribution is a simple rescaling scheme that overcomes these deleterious effects of weights. We show that for a given QAOA depth, a single “typical” vector of QAOA parameters can be successfully transferred to weighted MaxCut instances. This transfer leads to a median decrease in the approximation ratio of only 2.0 percentage points relative to a considerably more expensive direct optimization on a dataset of 34,701 instances with up to 20 nodes and multiple weight distributions. This decrease can be reduced to 1.2 percentage points at the cost of only 10 additional QAOA circuit evaluations with parameters sampled from a pretrained metadistribution, or the transferred parameters can be used as a starting point for a single local optimization run to obtain approximation ratios equivalent to those achieved by exhaustive optimization in 96.35% of our cases.
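The abstract's key idea is that parameters optimized for unweighted MaxCut can be reused on weighted instances once the cost angles are rescaled to compensate for the edge weights. Below is a minimal Python sketch of that kind of transfer. The scale factor used here (the mean absolute edge weight) and the function name are illustrative assumptions for exposition, not necessarily the exact scheme defined in the paper.

```python
# Hedged sketch: rescaling transferred QAOA angles for a weighted MaxCut
# instance. The scale factor (mean absolute edge weight) is an assumption
# made for illustration; consult the paper for the authors' exact scheme.

from statistics import mean
from typing import List, Tuple


def rescale_transferred_parameters(
    gammas: List[float],
    betas: List[float],
    edge_weights: List[float],
) -> Tuple[List[float], List[float]]:
    """Adapt QAOA angles optimized on unweighted MaxCut to a weighted instance.

    The cost-Hamiltonian angles (gammas) are divided by a weight scale so the
    phase accumulated per edge stays comparable to the unweighted case; the
    mixer angles (betas) are left unchanged.
    """
    scale = mean(abs(w) for w in edge_weights)  # assumed choice of scale
    return [g / scale for g in gammas], list(betas)


# Example: depth p = 2 "typical" unweighted-MaxCut angles transferred to a
# weighted instance (all numbers are placeholders, not values from the paper).
gammas, betas = [0.4, 0.7], [0.6, 0.3]
weights = [1.8, 0.4, 2.6, 1.2]
new_gammas, new_betas = rescale_transferred_parameters(gammas, betas, weights)
print(new_gammas, new_betas)
```

In a full workflow, the rescaled angles would either be evaluated directly on the weighted instance or used as the starting point for a single local optimization run, as the abstract describes.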