17 Jan 2024 | Kouhei Nakaji, Lasse Bjørn Kristensen, Jorge A. Campos-Gonzalez-Angulo, Mohammad Ghazi Vakili, Haoze Huang, Mohsen Bagherimehrab, Christoph Gorgulla, FuTe Wong, Alex McCaskey, Jin-Sung Kim, Thien Nguyen, Pooja Rao, Alan Aspuru-Guzik
The generative quantum eigensolver (GQE) is a novel method that applies classical generative models to quantum simulation. It optimizes a classical generative model to produce quantum circuits with desired properties. The paper introduces a transformer-based implementation called the generative pretrained transformer-based quantum eigensolver (GPT-QE), which supports both pre-training on existing datasets and training from scratch without prior knowledge. The effectiveness of GPT-QE in searching for the ground states of electronic structure Hamiltonians is demonstrated, and GQE strategies can extend beyond Hamiltonian simulation into other quantum computing applications.
The paper develops GQE around the transformer architecture, treating a quantum circuit as a sequence of tokens analogous to a sentence in a natural language document. GPT-QE uses a transformer to generate sequences of unitary operations drawn from a pool of candidate operators; the composed unitaries form the circuit, and the model is trained to minimize the energy of the quantum state that circuit prepares. Because the trainable parameters live in the classical model rather than in the circuit itself, the approach offers ease of optimization, quantum resource efficiency, and customizability.
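To make the training loop concrete, below is a minimal self-contained sketch of the GQE idea, not the authors' implementation: the transformer is replaced by a toy factorized model (one categorical distribution per sequence position), the operator pool is a handful of single-qubit rotations, the Hamiltonian is Pauli Z, and a REINFORCE-style energy-weighted update stands in for the paper's training objective. All names and values here are illustrative assumptions.

```python
import numpy as np
import torch

theta = np.pi / 4

def rx(t):  # rotation about X as an explicit 2x2 matrix
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]])

def rz(t):  # rotation about Z
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

pool = [rx(theta), rx(-theta), rz(theta), rz(-theta)]  # fixed operator pool

H = np.diag([1.0, -1.0]).astype(complex)    # toy Hamiltonian: Pauli Z, ground energy -1
psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state |0>

def energy(tokens):
    """Apply the pooled unitaries in sequence, then evaluate <psi|H|psi>."""
    psi = psi0
    for t in tokens:
        psi = pool[t] @ psi
    return float(np.real(psi.conj() @ (H @ psi)))

seq_len, n_ops = 6, len(pool)
logits = torch.zeros(seq_len, n_ops, requires_grad=True)  # toy generative model
opt = torch.optim.Adam([logits], lr=0.1)
baseline = 0.0

for step in range(300):
    dist = torch.distributions.Categorical(logits=logits)
    tokens = dist.sample()            # one sampled gate sequence = one circuit
    E = energy(tokens.tolist())       # circuit evaluation (classically simulated here)
    loss = dist.log_prob(tokens).sum() * (E - baseline)  # REINFORCE-style update
    opt.zero_grad(); loss.backward(); opt.step()
    baseline = 0.9 * baseline + 0.1 * E  # moving baseline reduces gradient variance

print("best energy found ~", energy(torch.argmax(logits, dim=1).tolist()))
```

Swapping the factorized model for an actual GPT and the toy simulator for a quantum backend recovers the overall structure of GPT-QE: all gradients are taken with respect to classical model parameters, and the quantum device only supplies energy estimates.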
The paper presents results showing that GPT-QE can effectively approximate the ground states of molecular Hamiltonians such as H2, LiH, BeH2, and N2, finding quantum states with energies close to the ground-state energy. The effectiveness of the training scheme is confirmed by comparing the average energy deviation of generated circuits against that of randomly generated ones. Pre-training is also shown to be effective: using a dataset of previously evaluated circuits, the transformer can be trained without operating a quantum device, significantly reducing the number of quantum circuit runs.
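As a companion to the sketch above, here is how offline pre-training might look under the same toy model: the model is fitted to a stored dataset of previously evaluated (sequence, energy) pairs via a Boltzmann-weighted maximum likelihood, so no quantum evaluations are needed during this phase. The dataset values, temperature, and weighting scheme are illustrative assumptions, not the paper's exact procedure.

```python
import torch

# Hypothetical stored dataset: token sequences with energies recorded from
# earlier (simulated or hardware) runs; no quantum device is used below.
dataset = [([0, 0, 0, 0], -0.92), ([2, 1, 3, 0], 0.31), ([0, 1, 0, 0], -0.44)]
T = 0.2                                        # temperature of the Boltzmann weighting
seq_len, n_ops = 4, 4
logits = torch.zeros(seq_len, n_ops, requires_grad=True)  # same toy model as above
opt = torch.optim.Adam([logits], lr=0.05)

for epoch in range(200):
    opt.zero_grad()
    loss = torch.tensor(0.0)
    for tokens, E in dataset:
        weight = torch.exp(torch.tensor(-E / T))    # low energy -> large weight
        logp = (torch.distributions.Categorical(logits=logits)
                .log_prob(torch.tensor(tokens)).sum())
        loss = loss - weight * logp                 # weighted maximum likelihood
    loss.backward()
    opt.step()
```

A model pre-trained this way can then be fine-tuned with the on-device loop shown earlier, which is what makes shared circuit-energy datasets valuable.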
The paper concludes that GQE is a promising approach for quantum computing, with potential applications beyond ground-state approximation. It highlights pre-training as a reason to store and share circuit datasets as common resources. The paper also discusses future research directions, including validating the performance of GQE on actual quantum devices, integrating GQE with the VQE framework, and exploring the use of GQE in supervised machine learning problems.