Variational quantum simulation: a case study for understanding warm starts


June 26, 2024 | Ricard Puig, Marc Drudis, Supanut Thanasilp, and Zoë Holmes
The paper explores the potential of warm starts in variational quantum algorithms, focusing on an iterative method for learning short-depth circuits that approximate quantum real and imaginary time evolution. The authors analyze the trainability of this method by proving that the loss landscape exhibits substantial gradients in a small region around the initialization at each time step, with accompanying convexity guarantees that suggest trainability for polynomially sized time steps. However, they also identify scenarios in which a good minimum drifts outside the region covered by these guarantees, raising the question of whether successful optimization then requires traversing barren plateau landscapes, or whether "fertile valleys" of substantial gradients connect successive minima. The study provides analytical bounds on the loss variance and on convexity, and discusses the adiabatic minimum and the possibility of minimum jumps, suggesting that while warm starts can improve trainability, the global structure of the landscape remains a critical factor. Numerical evidence supports the existence of gradient flows along fertile valleys, indicating that training can succeed even when barren plateaus are present elsewhere in the landscape.
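At its core, the iterative scheme is simple to state: at each time step, a short-depth parameterized circuit is trained to reproduce one step of (real or imaginary) time evolution applied to the previously learned state, with the optimizer warm-started from the previous step's optimal parameters. The sketch below illustrates this loop for real-time evolution of a two-qubit transverse-field Ising model; the Hamiltonian, ansatz, infidelity loss, and finite-difference optimizer are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an iterative, warm-started variational time-evolution loop.
# The Hamiltonian, ansatz, loss, and optimizer are illustrative choices only.
import numpy as np

# Two-qubit transverse-field Ising Hamiltonian (assumed example system).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))

def time_step_unitary(H, dt):
    """exp(-i H dt) via eigendecomposition (H is Hermitian)."""
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * evals * dt)) @ evecs.conj().T

# Short-depth ansatz: layers of single-qubit Ry and Rz rotations plus a fixed CZ.
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def ry(a):
    c, s = np.cos(a / 2), np.sin(a / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(a):
    return np.diag([np.exp(-1j * a / 2), np.exp(1j * a / 2)])

def ansatz_state(theta, n_layers=3):
    """Apply n_layers of (Ry⊗Ry)·(Rz⊗Rz)·CZ to |00>; theta holds 4*n_layers angles."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0
    for l in range(n_layers):
        a, b, c, d = theta[4 * l: 4 * l + 4]
        state = CZ @ (np.kron(rz(c), rz(d)) @ (np.kron(ry(a), ry(b)) @ state))
    return state

def loss(theta, target):
    """Infidelity between the ansatz state V(theta)|00> and the target state."""
    return 1.0 - np.abs(np.vdot(target, ansatz_state(theta))) ** 2

def minimize(theta0, target, lr=0.2, n_iter=300, eps=1e-4):
    """Finite-difference gradient descent, started from theta0 (the warm start)."""
    theta = theta0.copy()
    for _ in range(n_iter):
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            shift = np.zeros_like(theta)
            shift[i] = eps
            grad[i] = (loss(theta + shift, target) - loss(theta - shift, target)) / (2 * eps)
        theta -= lr * grad
    return theta

# Iterative loop: each step targets one exact time step applied to the learned
# state, and the optimizer is initialized at the previous step's parameters.
dt, n_steps, n_layers = 0.1, 10, 3
U_dt = time_step_unitary(H, dt)
theta = np.zeros(4 * n_layers)            # V(0)|00> = |00>, the chosen initial state
for k in range(n_steps):
    target = U_dt @ ansatz_state(theta, n_layers)
    theta = minimize(theta, target)       # warm start from the previous parameters
    print(f"time step {k + 1}: infidelity = {loss(theta, target):.2e}")
```

The warm start is the key point: because each optimization begins at the previous step's solution and the target state only moves by one small time step, the minimizer starts inside the small region around the initialization where the paper's gradient and convexity guarantees apply.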