A new approach to gradient-free optimization uses the gradient approximated by SPSA in combination with state-of-the-art gradient-based classical optimizers.
Variational quantum algorithms (VQAs) have attracted a lot of attention from the quantum computing community over the last few years. Their hybrid quantum-classical nature with relatively shallow quantum circuits makes them a promising platform for demonstrating the capabilities of noisy intermediate-scale quantum (NISQ) devices. Although the classical machine learning community focuses on gradient-based parameter optimization, finding near-exact gradients for variational quantum circuits (VQCs) with the parameter-shift rule introduces a large sampling overhead. Therefore, gradient-free optimizers have gained popularity in quantum machine learning circles. Among the most promising candidates is the simultaneous perturbation stochastic approximation (SPSA) algorithm, due to its low computational cost and inherent noise resilience. We introduce a novel approach that uses the approximated gradient from SPSA in combination with state-of-the-art gradient-based classical optimizers. We demonstrate numerically that this outperforms both standard SPSA and the parameter-shift rule in terms of convergence rate and absolute error in simple regression tasks. The improvement of our novel approach over SPSA with stochastic gradient descent (SGD) is further amplified when shot and hardware noise are taken into account. We also demonstrate that error mitigation does not significantly affect our results.
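To illustrate the core idea, here is a minimal sketch (not the authors' code) of how a single SPSA gradient estimate, which needs only two evaluations of the cost function regardless of the number of parameters, can be fed into a gradient-based optimizer such as Adam instead of plain SGD. The toy quadratic loss stands in for the actual VQC cost, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def spsa_gradient(loss, params, c=0.1, rng=np.random.default_rng()):
    """Estimate the gradient of `loss` at `params` with one simultaneous
    random perturbation, i.e. only two loss evaluations."""
    delta = rng.choice([-1.0, 1.0], size=params.shape)   # Rademacher perturbation
    diff = loss(params + c * delta) - loss(params - c * delta)
    # Since delta_i is +/-1, dividing by delta_i equals multiplying by delta_i.
    return diff / (2.0 * c) * delta

def adam_step(params, grad, state, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update driven by the stochastic SPSA gradient estimate."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, (m, v, t)

# Toy usage: a quadratic cost standing in for the variational circuit's loss.
loss = lambda p: np.sum((p - 1.0) ** 2)
params = np.zeros(4)
state = (np.zeros_like(params), np.zeros_like(params), 0)
for _ in range(200):
    grad = spsa_gradient(loss, params)
    params, state = adam_step(params, grad, state)
print(params)  # converges towards [1, 1, 1, 1]
```

The same SPSA estimate could be handed to any other first-order optimizer; the paper's point is that replacing the plain SGD update with a more sophisticated gradient-based rule improves convergence at no extra quantum sampling cost.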
Published in: 2023 IEEE International Conference on Quantum Computing and Engineering (QCE)
Date of Conference: 17-22 September 2023
Date Added to IEEE Xplore: 30 November 2023
DOI: 10.1109/QCE57702.2023.00058
Publisher: IEEE
Authors:
Marco Wiedmann, Marc Hölle, Maniraman Periyasamy, Nico Meyer, Christian Ufrecht, Daniel D. Scherer, Axel Plinge, and Christopher Mutschler.
Fraunhofer Institute for Integrated Circuits IIS, Nuremberg, Germany.