
arXiv: 2407.12242
The Quantum Approximate Optimization Algorithm (QAOA) is a hybrid quantum-classical algorithm that shows promise for efficiently solving the MaxCut problem, a representative combinatorial optimization problem. Its effectiveness, however, depends heavily on the parameter optimization pipeline, where parameter initialization is nontrivial because the optimization landscape is non-convex and riddled with low-quality local minima. Recent work on diffusion-based generation of classical neural network parameters has shown that network training can benefit from initial parameters produced by diffusion models. Motivated by this, we formulate the search for good initial parameters as a generative task and propose an initial-parameter generation scheme based on dataset-conditioned sampling of pre-trained parameters. Concretely, a generative machine learning model, specifically a denoising diffusion probabilistic model (DDPM), is trained to learn the distribution of pre-trained parameters conditioned on the graph dataset. Intuitively, the proposed framework distills knowledge from pre-trained parameters to generate well-performing initial parameters for QAOA. Compared to random parameter initialization, experiments on MaxCut instances of various sizes consistently show that our conditional DDPM improves the approximation ratio by up to 14.4%, 11.0%, and 11.4%, and by 7.49%, 8.31%, and 6.08% on average, for random, regular, and Watts-Strogatz graphs, respectively. The experimental results also indicate that a conditional DDPM trained on small problem instances extrapolates to larger ones, improving the approximation ratio by up to 28.4% and by 12.1% on average.
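The core generative step described above is ancestral sampling from a DDPM whose noise predictor is conditioned on the problem instance. As a minimal illustrative sketch (not the authors' implementation), the code below draws a 2p-dimensional QAOA parameter vector with the standard DDPM reverse recursion; the noise-prediction network `eps_model` and the graph embedding `cond` are hypothetical stand-ins for the trained conditional model and the dataset conditioning described in the abstract.

```python
# Minimal sketch of conditional DDPM ancestral sampling for generating
# 2p QAOA angles (gamma_1..gamma_p, beta_1..beta_p) as a warm start.
# `eps_model(x_t, t, cond)` is a hypothetical trained noise-prediction
# network conditioned on a graph embedding; the recursion itself is the
# standard DDPM sampler (Ho et al., 2020), not code from the paper.

import numpy as np


def ddpm_sample(eps_model, cond, dim, T=1000, beta_start=1e-4, beta_end=0.02, seed=0):
    """Draw one parameter vector x_0 in R^dim via DDPM ancestral sampling."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(beta_start, beta_end, T)        # linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)                     # \bar{alpha}_t

    x = rng.standard_normal(dim)                        # x_T ~ N(0, I)
    for t in reversed(range(T)):
        z = rng.standard_normal(dim) if t > 0 else 0.0  # no noise at the last step
        eps = eps_model(x, t, cond)                     # predicted noise, conditioned on the graph
        # Posterior mean: (x_t - beta_t / sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_t)
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        x = x + np.sqrt(betas[t]) * z                   # add posterior noise (sigma_t^2 = beta_t)
    return x


# Hypothetical usage: `cond` is an embedding of the MaxCut graph instance, and
# the 2p outputs are split into gamma (cost) and beta (mixer) angles for depth-p QAOA.
p = 3
dummy_eps_model = lambda x, t, cond: np.zeros_like(x)   # stand-in for the trained network
params = ddpm_sample(dummy_eps_model, cond=np.zeros(16), dim=2 * p)
gammas, betas_mixer = params[:p], params[p:]
```

In the setting of the paper, such sampled angles would replace random initialization and warm-start the classical optimizer in the QAOA loop, which is where the reported approximation-ratio gains are measured.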
Quantum Physics
| selected citations | 0 |
| popularity | Average |
| influence | Average |
| impulse | Average |
