Quantum error thresholds for gauge-redundant digitizations of lattice field theories

February 27, 2024 | Marcela Carena, Henry Lamm, Ying-Ying Li, and Wangqiang Liu
This paper investigates quantum error thresholds for gauge-redundant digitizations of lattice field theories. The authors characterize the correctable errors for generic finite gauge groups and design quantum circuits to detect and correct them. They then calculate the error thresholds below which the gauge-redundant digitization with Gauss's-law error correction achieves better fidelity than the gauge-fixed digitization. The results provide guidance for fault-tolerant quantum simulations of lattice gauge theories.

Gauge symmetries in quantum field theories give rise to rich phenomena, and lattice gauge theories are the standard tool for studying them. However, dynamical problems such as out-of-equilibrium evolution in the early universe, transport coefficients of the quark-gluon plasma, and parton physics in hadron collisions suffer from sign problems. Future large-scale quantum computers can avoid this obstacle by performing real-time simulations in the Hamiltonian formalism. To use quantum computers for such simulations, the infinite-dimensional Hilbert space of the gauge theory must first be truncated, and many digitization proposals have been studied for this purpose. All truncations break the continuous symmetries to some degree and produce theories with smaller symmetry groups; understanding the theoretical errors this introduces is an area of active research.

The authors consider two classes of methods for encoding these regularized theories on quantum computers. The first digitizes all states connected by gauge transformations, keeping the redundancy, and uses Gauss's law to project onto the gauge-invariant subspace. The second digitizes only the gauge-invariant states. Gauge fixing reduces the qubit cost but can complicate the Hamiltonian. The effect of noisy hardware has largely been neglected in this comparison: gauge-redundant digitizations suffer more severely from quantum noise simply because more qubits admit more errors.
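The redundancy-versus-projection trade-off can be made concrete with a small toy model. The sketch below (an illustrative Z2 example on a 2x2 periodic lattice, not taken from the paper) counts how few of the redundantly digitized states actually satisfy Gauss's law at every vertex; the lattice size and link indexing are choices made here for illustration.

```python
import numpy as np
from itertools import product

# Toy model: Z2 lattice gauge theory on a 2x2 periodic lattice
# (8 links -> 256 basis states in the electric basis).
# A gauge transformation at vertex v flips (Pauli-X) every link touching v;
# Gauss's law keeps only states with eigenvalue G_v = +1 at every vertex.

L = 2
n_links = 2 * L * L          # horizontal + vertical links
dim = 2 ** n_links

def H(x, y): return y * L + x            # horizontal link leaving (x, y)
def V(x, y): return L * L + y * L + x    # vertical link leaving (x, y)

def gauss_mask(x, y):
    """Bitmask of the four links touching vertex (x, y)."""
    links = [H(x, y), H((x - 1) % L, y), V(x, y), V(x, (y - 1) % L)]
    mask = 0
    for l in links:
        mask |= 1 << l
    return mask

# Each Gauss operator is a permutation of basis states: s -> s XOR mask.
projector = np.eye(dim)
for x, y in product(range(L), range(L)):
    m = gauss_mask(x, y)
    G = np.zeros((dim, dim))
    for s in range(dim):
        G[s ^ m, s] = 1.0
    projector = projector @ (np.eye(dim) + G) / 2

# Dimension of the gauge-invariant (physical) subspace:
print(int(round(np.trace(projector))))   # 32 of 256 states survive
```

Only 32 of the 256 redundant states are physical here; the rest are pure gauge redundancy, which is exactly the space the paper turns into an error-detection resource.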
Not all errors are equally harmful, however. Noise that breaks symmetries can change the universality class being simulated, so sufficient symmetry preservation is crucial. Gauge-violating errors can be mitigated by introducing energy penalties or random gauge transformations into the Hamiltonian evolution, or corrected outright by measuring and restoring Gauss's law. The gauge-fixed digitization has no such natural mitigation or correction methods and must rely on generic techniques for the residual errors.

Redundancy and symmetry play a central role in both quantum error correction (QEC) and quantum error mitigation (QEM). QEC deliberately constructs a redundant Hilbert space on the physical qudits and encodes quantum information in a much smaller code subspace with certain symmetries, allowing errors to be corrected without disrupting the coherent quantum information in that subspace. A key quantity is the threshold error rate of the physical qubits below which adding redundancy makes the code more error-proof. Analogously, the authors compute the threshold below which gauge redundancy makes the digitization more robust against noise.
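The idea of detecting a gauge-violating error by "measuring Gauss's law" can be sketched with a standard ancilla-based parity measurement. This is a minimal Z2 illustration assuming the textbook phase-kickback circuit (H on an ancilla, controlled-G_v, H, then read the ancilla), not the paper's general finite-group circuits; the state and error chosen below are hypothetical.

```python
import numpy as np

# Measure the Gauss-law operator G_v = X X X X on the four links around one
# vertex, using a single ancilla: ancilla outcome 0 <-> eigenvalue +1,
# outcome 1 <-> eigenvalue -1 (a gauge violation).

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
Hd = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Qubit order: ancilla first, then the 4 link qubits.
G_links = kron(X, X, X, X)                       # 16x16 Gauss operator
ctrlG = (np.kron(np.diag([1.0, 0.0]), np.eye(16))
         + np.kron(np.diag([0.0, 1.0]), G_links))
Ha = kron(Hd, I2, I2, I2, I2)                    # Hadamard on the ancilla

# A gauge-invariant (+1) link state: (|0000> + |1111>) / sqrt(2)
links = np.zeros(16)
links[0] = links[15] = 1 / np.sqrt(2)
state = np.kron(np.array([1.0, 0.0]), links)     # ancilla starts in |0>

# Inject a gauge-violating error: Z on the first link anticommutes with
# G_v, flipping the state into the -1 eigenspace.
state = kron(I2, Z, I2, I2, I2) @ state

state = Ha @ ctrlG @ Ha @ state
p1 = np.sum(np.abs(state[16:]) ** 2)             # prob. ancilla reads 1
print(round(p1, 6))                              # 1.0: the error is flagged
```

A single syndrome only reveals that some link around the vertex was hit; in the full scheme, syndromes from neighboring vertices are combined to localize and undo the error.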
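The threshold comparison can be illustrated with a back-of-envelope error-budget model. Everything below is a toy, not the paper's calculation: the qubit counts, the fraction f of errors that violate Gauss's law, and the assumption that single gauge-violating errors are corrected perfectly while pairs of them can combine into an undetected gauge-invariant error are all made up for illustration.

```python
from math import comb

# Toy model: gauge-redundant register of n_r qubits vs. a gauge-fixed
# register of n_f < n_r qubits, per-qubit error probability p.
# Redundant case: the (1 - f) gauge-invariant fraction of single errors is
# uncorrectable, and pairs of gauge-violating errors can slip through.
n_r, n_f, f = 12, 8, 0.8     # hypothetical qubit counts and fraction

def infidelity_redundant(p):
    return (1 - f) * n_r * p + comb(n_r, 2) * (f * p) ** 2

def infidelity_fixed(p):
    return n_f * p

# Redundancy wins while infidelity_redundant < infidelity_fixed; the
# crossover (the "threshold") follows from equating the two expressions:
p_star = (n_f - (1 - f) * n_r) / (comb(n_r, 2) * f ** 2)
print(f"toy threshold p* ~ {p_star:.4f}")   # ~0.1326 for these numbers
```

Below p*, the extra qubits pay for themselves because most errors are detectable gauge violations; above it, the quadratic leakage of undetected error pairs dominates and the leaner gauge-fixed register is the better choice. The paper computes the analogous thresholds for generic finite gauge groups.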