Finding efficient optimization methods plays an important role in quantum optimization and quantum machine learning on near-term quantum computers. While backpropagation on classical computers is computationally efficient, obtaining gradients on quantum computers is not, because the computational cost scales linearly with the number of parameters and with the number of measurements. In this talk, we connect Koopman operator theory, which has been successful in predicting nonlinear dynamics, with natural gradient methods in quantum optimization. We propose a data-driven approach that uses Koopman operator learning to accelerate quantum optimization and quantum machine learning. Because a naive application of Koopman operator learning fails, we develop a simple algorithm that alternates between two new families of methods we propose: sliding-window dynamic mode decomposition (DMD) and neural DMD. The resulting Quantum-circuit Alternating Controlled Koopman operator learning algorithm (QuACK) efficiently updates parameters on quantum computers. We show that our methods can predict gradient dynamics on quantum computers and accelerate the variational quantum eigensolver used in quantum optimization, with applications to various physics models and quantum chemistry, as well as quantum machine learning. We further implement our Koopman operator learning algorithms on a real IBM quantum computer and demonstrate their practical effectiveness.
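To illustrate the sliding-window DMD building block mentioned above, the sketch below fits a best-fit linear operator between consecutive parameter snapshots in a recent window of an optimization trajectory and extrapolates it forward, the idea being that predicted parameter updates can replace some costly gradient evaluations on quantum hardware. This is a minimal assumption-laden sketch, not the QuACK implementation: the function name, window and horizon sizes, and the synthetic trajectory (chosen to have linearizable dynamics so the fit is exact) are all illustrative.

```python
import numpy as np

def sliding_window_dmd_predict(thetas, window=10, horizon=5):
    """Sliding-window DMD sketch (illustrative, not the paper's code).

    thetas: array of shape (T, p) -- parameter snapshots along an
            optimization trajectory.
    Fits a linear map K on the last `window` snapshots so that
    consecutive snapshots satisfy theta_{t+1} ~= K theta_t, then
    extrapolates `horizon` steps ahead.
    """
    X = thetas[-window:-1].T      # snapshots,        shape (p, window-1)
    Y = thetas[-window + 1:].T    # shifted snapshots, shape (p, window-1)
    # Least-squares fit Y ~= K X: best-fit linear dynamics on the window.
    K = Y @ np.linalg.pinv(X)
    preds = []
    x = thetas[-1]
    for _ in range(horizon):
        x = K @ x                 # roll the fitted dynamics forward
        preds.append(x)
    return np.array(preds)

# Toy trajectory: geometric decay toward an optimum `target`. The
# dynamics are linear on the spanned subspace, so DMD extrapolates
# them exactly up to numerical error.
T, p = 30, 4
target = np.ones(p)
v = np.array([1.0, -2.0, 0.5, 3.0])
thetas = np.array([target + 0.9**t * v for t in range(T)])

preds = sliding_window_dmd_predict(thetas, window=10, horizon=3)
true_next = target + 0.9**T * v
print(np.allclose(preds[0], true_next, atol=1e-6))  # True on this toy problem
```

In the alternating scheme described in the talk, such predicted parameters would be interleaved with controlled gradient steps on the quantum device; the neural DMD variant replaces the linear fit with a learned lifting of the parameters.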