Large-scale quantum reservoir learning with an analog quantum computer


Dated: July 4, 2024 | Milan Kornjaca, Hong-Ye Hu, Chen Zhao, Jonathan Wurtz, Phillip Weinberg, Majd Hamdan, Andrii Zhdanov, Sergio H. Cantu, Hengyun Zhou, Rodrigo Araiza Bravo, Kevin Bagnall, James I. Basham, Joseph Campo, Adam Choukri, Robert DeAngelo, Paige Frederick, David Haines, Julian Hammett, Ning Hsu, Ming-Guang Hu, Florian Huber, Paul Niklas Jepsen, Thomas Karolyshyn, Minho Kwon, John Long, Jonathan Lopatin, Alexander Lukin, Tommaso Macrì, Ognjen Marković, Luis A. Martínez-Martínez, Xianmei Meng, Evgeny Ostroumov, David Paquette, John Robinson, Pedro Sales Rodriguez, Anshuman Singh, Nandan Sinha, Henry Thoreen, Noel Wan, Daniel Waxman-Lenz, Tak Wong, Kai-Hsin Wu, Pedro L. S. Lopes, Yuval Boger, Nathan Gemelke, Takuya Kitagawa, Alexander Keesling, Xun Gao, Alexei Bylinskii, Susanne F. Yelin, Fangli Liu, and Sheng-Tao Wang
The paper presents a large-scale quantum reservoir learning algorithm implemented on an analog quantum computer, leveraging the quantum dynamics of neutral-atom systems. The algorithm is designed to be general-purpose, gradient-free, and scalable, addressing the resource-intensive and gradient-related challenges of contemporary quantum machine learning methods. Key advancements include:

1. **Experimental Implementation**: The algorithm is experimentally implemented on a publicly accessible analog quantum computer, achieving competitive performance across various machine learning tasks, including binary and multi-class classification and time-series prediction.
2. **Large-Scale Experiment**: The system size is expanded to up to 108 qubits, representing the largest quantum machine learning experiment to date and demonstrating the potential of quantum correlations for effective machine learning.
3. **Quantum Kernel Advantage**: Comparative quantum kernel advantage is observed in learning tasks by constructing synthetic datasets based on the geometric differences between quantum and classical data kernels, showing that non-classical correlations of the quantum reservoir can be utilized for effective machine learning even on current, noisy quantum hardware.
4. **Versatility and Noise Resilience**: The QRC framework is shown to be versatile and noise-resilient, with performance improving with increasing system sizes. The universal parameter regime, informed by physical insights, eliminates the need for parameter optimization in the quantum part, saving significant quantum resources.
5. **Application to Various Tasks**: The algorithm is applied to image classification tasks, including binary and 10-class classification of MNIST handwritten digits and 3-class classification of tomato leaf disease images, achieving performance comparable to classical methods and even surpassing them in certain cases.
6. **Future Directions**: The findings open avenues for further exploration, including scaling up experimental sampling rates and system sizes, tailoring the algorithm to different platforms, and investigating the comparative quantum kernel advantage on more datasets.

The results highlight the potential of quantum reservoir computing as a powerful and resource-efficient approach for machine learning tasks, with implications for both classical and quantum hardware and machine learning paradigms.
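To make the gradient-free workflow concrete, the reservoir-computing pipeline can be sketched numerically. The following is a minimal, purely illustrative NumPy simulation of a tiny spin system, not the paper's 108-qubit neutral-atom protocol: the Hamiltonian, encoding, and toy labels are all assumptions. The key structural point it illustrates is that inputs are encoded into a fixed (untrained) quantum evolution, measured local observables serve as features, and only a classical linear readout is trained.

```python
import numpy as np

# Illustrative QRC sketch (NOT the paper's exact protocol):
# 4 simulated qubits, data encoded as local detunings of a fixed
# transverse-field Ising Hamiltonian (a stand-in for Rydberg dynamics).

N = 4  # number of qubits (the experiment scales to 108; we simulate 4)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def op(single, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit system."""
    mats = [single if i == site else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def reservoir_features(x, t=1.0):
    """Encode x (length N) into detunings, evolve |0...0> for time t under
    a fixed Hamiltonian, and return the local magnetizations <Z_i>."""
    H = sum(op(X, i, N) for i in range(N))           # fixed global drive
    H = H + sum(x[i] * op(Z, i, N) for i in range(N))  # data-dependent detuning
    for i in range(N - 1):                            # fixed ZZ interactions
        H = H + op(Z, i, N) @ op(Z, i + 1, N)
    vals, vecs = np.linalg.eigh(H)
    psi0 = np.zeros(2**N, dtype=complex)
    psi0[0] = 1.0
    psi_t = vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi0))
    return np.array([np.real(psi_t.conj() @ op(Z, i, N) @ psi_t)
                     for i in range(N)])

# Gradient-free training: the quantum part is fixed; only a classical
# ridge-regression readout on the measured features is fitted.
rng = np.random.default_rng(0)
Xdata = rng.uniform(-1, 1, size=(40, N))
y = (Xdata.sum(axis=1) > 0).astype(float)   # toy binary labels
F = np.array([reservoir_features(x) for x in Xdata])
F = np.hstack([F, np.ones((len(F), 1))])    # bias column
w = np.linalg.solve(F.T @ F + 1e-3 * np.eye(F.shape[1]), F.T @ y)
acc = np.mean((F @ w > 0.5) == (y > 0.5))
```

Because no gradients ever flow through the quantum dynamics, the quantum resource cost is independent of training: the hardware is queried once per sample to produce features, and all optimization happens classically.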
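The synthetic-dataset construction in point 3 rests on the geometric difference between quantum and classical kernel matrices. As a hedged sketch, the code below computes the standard geometric-difference quantity from the quantum-kernel literature (Huang et al., "Power of data"); the random positive-semidefinite kernels are stand-ins for the measured reservoir kernels, and the regularization constant is an illustrative choice.

```python
import numpy as np

def psd_sqrt(K):
    """Matrix square root of a symmetric positive-semidefinite matrix."""
    vals, vecs = np.linalg.eigh(K)
    vals = np.clip(vals, 0.0, None)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

def geometric_difference(K_c, K_q, reg=1e-6):
    """g = sqrt(|| sqrt(K_q) (K_c + reg*I)^{-1} sqrt(K_q) ||),
    with ||.|| the spectral norm. Large g indicates the classical
    kernel K_c cannot easily reproduce the quantum kernel K_q."""
    n = K_c.shape[0]
    sq = psd_sqrt(K_q)
    M = sq @ np.linalg.inv(K_c + reg * np.eye(n)) @ sq
    return float(np.sqrt(np.linalg.eigvalsh(M).max()))

# Random PSD stand-ins for measured classical/quantum kernels.
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 20)); K_c = A @ A.T / 20
B = rng.normal(size=(20, 20)); K_q = B @ B.T / 20
g = geometric_difference(K_c, K_q)
```

A dataset engineered so that its labels align with the top eigenvectors of the quantum kernel (where g is large) is then, by construction, easy for the quantum-reservoir readout and hard for the classical kernel, which is the sense in which the paper's comparative kernel advantage is defined.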