
Quantum Machine Learning is gaining traction for its potential to outperform classical Machine Learning by exploiting quantum advantage. Many classical and quantum optimizers have been proposed to train Parameterized Quantum Circuits in simulation, achieving high accuracy and fast convergence. However, to the best of our knowledge, no prior work has compared these optimizers across multiple algorithms, which can lead to the selection of suboptimal optimizers. In this article, we first benchmark the most popular classical and quantum optimizers, including Gradient Descent (GD), Adaptive Moment Estimation (Adam), and Quantum Natural Gradient Descent (QNG), on the Quantum Compilation algorithm, evaluating the lowest cost value attained and the wall time. The results indicate that Adam outperforms the other optimizers in convergence speed, cost value, and stability. We then conduct additional experiments on multiple algorithms with Adam variants, demonstrating that the choice of hyperparameters significantly affects the optimizer's performance.
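As a minimal sketch of the kind of benchmark the abstract describes (not the paper's actual code), the snippet below trains the same toy parameterized circuit with GD, Adam, and QNG in PennyLane and records the final cost and wall time. The two-qubit circuit, step sizes, and iteration count are illustrative assumptions; the article itself benchmarks a Quantum Compilation cost function.

```python
# Hypothetical benchmarking sketch: same PQC, three optimizers,
# recording the lowest cost value reached and the wall time.
import time
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def cost(params):
    # Toy 2-qubit ansatz; a stand-in for the paper's Quantum
    # Compilation cost function.
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

# Step sizes and iteration count are illustrative choices.
optimizers = {
    "GD": qml.GradientDescentOptimizer(stepsize=0.1),
    "Adam": qml.AdamOptimizer(stepsize=0.1),
    "QNG": qml.QNGOptimizer(stepsize=0.1),
}

for name, opt in optimizers.items():
    params = np.array([0.5, 0.5], requires_grad=True)
    start = time.perf_counter()
    for _ in range(100):
        params = opt.step(cost, params)
    wall = time.perf_counter() - start
    print(f"{name}: final cost = {cost(params):.6f}, wall time = {wall:.2f} s")
```

In this setup, each optimizer sees an identical circuit and initial parameters, so differences in final cost and wall time reflect the optimizer itself, which is the comparison the article draws.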
Keywords: variational quantum eigensolver, quantum approximate optimization algorithm, quantum simulator, dynamic simulation, optimizer, benchmarking
