17 Jan 2024 | Kouhei Nakaji, Lasse Bjørn Kristensen, Jorge A. Campos-Gonzalez-Angulo, Mohammad Ghazi Vakili, Haozhe Huang, Mohsen Bagherimehrab, Christoph Gorgulla, FuTe Wong, Alex McCaskey, Jin-Sung Kim, Thien Nguyen, Pooja Rao, and Alan Aspuru-Guzik
The paper introduces the generative quantum eigensolver (GQE), a novel method that applies classical generative models to the construction of quantum circuits for ground-state search. Specifically, it focuses on the electronic structure problem, where the Hamiltonian is represented as a weighted sum of tensor products of Pauli operators. The GQE algorithm optimizes a classical generative model so that it produces quantum circuits with desired properties. The authors develop a transformer-based implementation, the generative pre-trained transformer (GPT) quantum eigensolver (GPT-QE), which supports both pre-training on existing datasets and training from scratch without prior knowledge.
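To make the Hamiltonian representation concrete, here is a minimal sketch of a weighted sum of Pauli tensor products and its exact ground-state energy via diagonalization. The two-qubit terms and coefficients below are illustrative only, not the paper's molecular Hamiltonians.

```python
import numpy as np
from functools import reduce

# Single-qubit Pauli matrices.
PAULIS = {
    "I": np.eye(2, dtype=complex),
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}

def pauli_string_matrix(s):
    """Tensor product of single-qubit Paulis, e.g. 'XZ' -> X tensor Z."""
    return reduce(np.kron, (PAULIS[c] for c in s))

def hamiltonian(terms):
    """Weighted sum of Pauli strings: H = sum_k c_k P_k."""
    return sum(c * pauli_string_matrix(p) for c, p in terms)

# Illustrative 2-qubit Hamiltonian (made-up coefficients).
H = hamiltonian([(-1.0, "ZI"), (-1.0, "IZ"), (0.5, "XX")])
ground_energy = np.linalg.eigvalsh(H)[0]  # smallest eigenvalue
```

For small systems the exact spectrum is available this way; the point of GQE is to search for low-energy states when the Hamiltonian acts on too many qubits to diagonalize.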
GPT-QE uses a transformer architecture to generate quantum circuits by sampling sequences of unitary operators from a pre-defined pool. The generative model is trained to minimize a cost function based on the energies of the quantum states that the generated circuits prepare. The paper demonstrates the effectiveness of GPT-QE in searching for ground states of electronic structure Hamiltonians, showing that it finds quantum states whose energies are close to the ground-state energy. The training scheme combines logit matching, which aligns the sum of the logits of a sampled circuit with its measured energy, and pre-training, which uses datasets generated from previous GPT-QE runs or from other algorithms.
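The logit-matching idea can be sketched as a simple squared-error objective: for each sampled circuit, the sum of the logits of its chosen operators should match the measured energy, so that low-energy circuits become more likely under the model. The function and array shapes below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def logit_matching_loss(logits_per_step, energies):
    """Sketch of a logit-matching objective.

    logits_per_step: (batch, seq_len) logits of the token chosen at
                     each generation step of each sampled circuit.
    energies:        (batch,) measured energies of those circuits.
    Returns the mean squared difference between each circuit's
    summed logits and its energy.
    """
    seq_logit_sums = logits_per_step.sum(axis=1)
    return float(np.mean((seq_logit_sums - energies) ** 2))

# Hypothetical batch: 2 circuits, 3 operators each; the logit sums
# here were chosen to equal the energies, so the loss is ~0.
logits = np.array([[0.2, -0.5, 0.1], [-1.0, 0.3, -0.2]])
E = np.array([-0.2, -0.9])
loss = logit_matching_loss(logits, E)
```

In an actual training loop this scalar would be backpropagated through the transformer; here it only illustrates what quantity is being matched.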
The paper also discusses scenarios for obtaining datasets for pre-training, including model-to-model transfer, config-to-config transfer, and molecule-to-molecule transfer. Numerical experiments on molecular Hamiltonians (H₂, LiH, BeH₂, and N₂) show that GPT-QE effectively identifies low-energy states and that pre-training can significantly reduce the number of quantum circuit runs required for training.
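A pre-training dataset in any of these transfer scenarios can be thought of as a collection of (operator-sequence, energy) pairs harvested from earlier runs. The sketch below, with hypothetical names and data, keeps the lowest-energy sequences so the model starts from circuits that are already good.

```python
def build_pretraining_set(history, keep=100):
    """history: iterable of (token_sequence, energy) pairs collected
    from an earlier GPT-QE run or another algorithm.
    Returns the `keep` lowest-energy pairs for pre-training."""
    return sorted(history, key=lambda pair: pair[1])[:keep]

# Hypothetical history: token sequences index operators in the pool.
runs = [((3, 1, 4), -1.1), ((2, 7, 1), -0.4), ((5, 9, 2), -1.6)]
top2 = build_pretraining_set(runs, keep=2)
```

Under model-to-model, config-to-config, or molecule-to-molecule transfer, only the origin of `history` changes; the selection step is the same.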
The authors conclude by highlighting the potential of GQE for practical quantum applications and suggest future research directions, including validation on actual quantum devices, integration with VQE, and exploration of machine learning problems beyond ground-state approximation.