Benchmarking Variants of the Adam Optimizer for Quantum Machine Learning Applications

Tuan Hai Vu; Vu Trung Duong Le; Hoai Luan Pham; Yasuhiko Nakashima

IEEE Open Journal of the Computer Society, vol. 6, pp. 1146-1154, published 2025-07-08.
DOI: 10.1109/OJCS.2025.3586953 (https://ieeexplore.ieee.org/document/11072814/)
Citations: 0
Abstract
Quantum Machine Learning is gaining traction by leveraging quantum advantage to outperform classical Machine Learning. Many classical and quantum optimizers have been proposed to train Parameterized Quantum Circuits in simulation environments, achieving high accuracy and fast convergence. However, to the best of our knowledge, no prior work has investigated these optimizers across multiple algorithms, which may lead to the selection of suboptimal optimizers. In this article, we first benchmark the most popular classical and quantum optimizers, namely Gradient Descent (GD), Adaptive Moment Estimation (Adam), and Quantum Natural Gradient Descent (QNG), on the Quantum Compilation algorithm. Evaluated metrics include the lowest cost value reached and the wall time. The results indicate that Adam outperforms the other optimizers in terms of convergence speed, cost value, and stability. Furthermore, we conduct additional experiments on multiple algorithms with Adam variants, demonstrating that the choice of hyperparameters significantly impacts the optimizer's performance.
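To make the benchmarking setup concrete, the sketch below compares GD, Adam, and QNG on a toy parameterized quantum circuit using PennyLane. This is a minimal illustration under stated assumptions, not the authors' code: the two-qubit ansatz, the cost observable, the step sizes, and the iteration count are chosen for brevity, and the paper's Quantum Compilation cost function and hyperparameter settings are not reproduced here.

# Minimal sketch (assumptions, not the paper's setup): compare GD, Adam, and QNG
# on a toy 2-qubit parameterized circuit, reporting the metrics named in the
# abstract (cost value and wall time).
import time
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def cost(params):
    # Illustrative ansatz; the paper's Quantum Compilation cost differs.
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

optimizers = {
    "GD": qml.GradientDescentOptimizer(stepsize=0.1),
    "Adam": qml.AdamOptimizer(stepsize=0.1),
    "QNG": qml.QNGOptimizer(stepsize=0.1),  # requires a QNode cost to compute the metric tensor
}

for name, opt in optimizers.items():
    params = np.array([0.1, 0.2], requires_grad=True)
    start = time.time()
    for _ in range(100):
        params, cost_val = opt.step_and_cost(cost, params)
    print(f"{name}: cost {cost_val:.4f}, wall time {time.time() - start:.2f} s")

Adam variants in this style of experiment would amount to constructing qml.AdamOptimizer with different keyword arguments (stepsize, beta1, beta2, eps), which reflects the abstract's point that hyperparameter choices strongly affect optimizer performance.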