{"title":"定律:量子神经网络的环顾四周和热启动自然梯度下降","authors":"Zeyi Tao, Jindi Wu, Qi Xia, Qun Li","doi":"10.1109/QSW59989.2023.00019","DOIUrl":null,"url":null,"abstract":"Variational quantum algorithms (VQAs) have recently received much attention due to their promising performance in Noisy Intermediate-Scale Quantum computers (NISQ). However, VQAs run on parameterized quantum circuits (PQC) with randomly initialized parameters are characterized by barren plateaus (BP) where the gradient vanishes exponentially in the number of qubits. In this paper, we proposed a Look Around Warm-Start (LAWS) quantum natural gradient (QNG) algorithm to mitigate the widespread existing BP issues. LAWS is a combinatorial optimization strategy taking advantage of model parameter initialization and fast convergence of QNG. LAWS repeatedly reinitializes parameter search space for the next iteration parameter update. The reinitialized parameter search space is carefully chosen by sampling the gradient close to the current optimal. Moreover, we present a unified framework (WS-SGD) for integrating parameter initialization techniques into the optimizer. We provide the convergence proof of the proposed framework for both convex and non-convex objective functions based on Polyak-Lojasiewicz (PL) condition. Our experiment results show that the proposed algorithm could mitigate the BP and have better generalization ability in quantum classification problems.","PeriodicalId":254476,"journal":{"name":"2023 IEEE International Conference on Quantum Software (QSW)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"LAWS: Look Around and Warm-Start Natural Gradient Descent for Quantum Neural Networks\",\"authors\":\"Zeyi Tao, Jindi Wu, Qi Xia, Qun Li\",\"doi\":\"10.1109/QSW59989.2023.00019\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Variational quantum algorithms (VQAs) have recently received much attention due to their promising performance in Noisy Intermediate-Scale Quantum computers (NISQ). However, VQAs run on parameterized quantum circuits (PQC) with randomly initialized parameters are characterized by barren plateaus (BP) where the gradient vanishes exponentially in the number of qubits. In this paper, we proposed a Look Around Warm-Start (LAWS) quantum natural gradient (QNG) algorithm to mitigate the widespread existing BP issues. LAWS is a combinatorial optimization strategy taking advantage of model parameter initialization and fast convergence of QNG. LAWS repeatedly reinitializes parameter search space for the next iteration parameter update. The reinitialized parameter search space is carefully chosen by sampling the gradient close to the current optimal. Moreover, we present a unified framework (WS-SGD) for integrating parameter initialization techniques into the optimizer. We provide the convergence proof of the proposed framework for both convex and non-convex objective functions based on Polyak-Lojasiewicz (PL) condition. 
Our experiment results show that the proposed algorithm could mitigate the BP and have better generalization ability in quantum classification problems.\",\"PeriodicalId\":254476,\"journal\":{\"name\":\"2023 IEEE International Conference on Quantum Software (QSW)\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-05-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE International Conference on Quantum Software (QSW)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/QSW59989.2023.00019\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Quantum Software (QSW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/QSW59989.2023.00019","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Variational quantum algorithms (VQAs) have recently received much attention due to their promising performance on Noisy Intermediate-Scale Quantum (NISQ) computers. However, VQAs run on parameterized quantum circuits (PQCs) with randomly initialized parameters suffer from barren plateaus (BPs), where the gradient vanishes exponentially in the number of qubits. In this paper, we propose a Look Around Warm-Start (LAWS) quantum natural gradient (QNG) algorithm to mitigate this widespread BP issue. LAWS is a combinatorial optimization strategy that takes advantage of model parameter initialization and the fast convergence of QNG. LAWS repeatedly reinitializes the parameter search space for the next parameter update; the reinitialized search space is carefully chosen by sampling the gradient close to the current optimum. Moreover, we present a unified framework (WS-SGD) for integrating parameter initialization techniques into the optimizer, and we prove its convergence for both convex and non-convex objective functions under the Polyak-Lojasiewicz (PL) condition. Our experimental results show that the proposed algorithm mitigates BPs and achieves better generalization on quantum classification problems.
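To make the "look around" warm-start idea concrete, below is a minimal, hypothetical Python sketch. It is not the authors' implementation: a classical quadratic loss stands in for the PQC cost, a fixed positive-definite matrix stands in for the quantum Fisher information metric, and the warm-start point is chosen as the lowest-loss sample rather than by the gradient-sampling rule described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic loss f(theta) = 0.5 * theta^T A theta, standing in for a PQC cost.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
# Fixed positive-definite metric standing in for the quantum Fisher information.
g = np.array([[2.0, 0.0],
              [0.0, 1.0]])

def loss(theta):
    return 0.5 * theta @ A @ theta

def grad(theta):
    return A @ theta

def laws_step(theta, lr=0.1, n_samples=8, radius=0.05):
    """One 'look around' warm-start followed by a natural-gradient update (sketch)."""
    # Look around: sample candidate warm-start points near the current iterate.
    candidates = theta + radius * rng.standard_normal((n_samples, theta.size))
    # Simplified selection rule: keep the lowest-loss candidate.  (The paper
    # instead selects the new search space from gradients sampled near the optimum.)
    warm = min(candidates, key=loss)
    # Natural-gradient step: precondition the gradient by the inverse metric.
    return warm - lr * np.linalg.solve(g, grad(warm))

theta = rng.standard_normal(2)
for _ in range(50):
    theta = laws_step(theta)
print("final loss:", loss(theta))
```

In a real VQA setting the metric g would be the quantum Fisher information evaluated at the current parameters, and the loss and gradient would come from circuit measurements.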
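For reference, the Polyak-Lojasiewicz (PL) condition invoked in the convergence analysis is the standard inequality below (generic notation, not taken from the paper): a differentiable objective f with minimum value f* satisfies the PL condition with constant mu > 0 if

```latex
\frac{1}{2}\,\bigl\lVert \nabla f(\theta) \bigr\rVert^{2} \;\ge\; \mu \,\bigl( f(\theta) - f^{*} \bigr)
\qquad \text{for all } \theta,
```

which is weaker than convexity yet still yields linear convergence rates for gradient-type methods.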