Zhongqi Hao1, Ke Liu2,8, Qianlin Lian2,6, Weiran Song3, Zongyu Hou3, Rui Zhang2,7, Qianqian Wang1, Chen Sun5, Xiangyou Li2,†, Zhe Wang3,‡
Received March 1, 2024; accepted May 26, 2024
Laser-induced breakdown spectroscopy (LIBS) is a powerful atomic emission spectroscopy technique with significant applications in various fields due to its simplicity, fast detection speed, and minimal sample damage. However, the spatial inhomogeneity and temporal variability of the laser-induced plasma make it challenging to achieve high-repeatability signal collection and accurate quantification. Traditional physical principle-based calibration models often fail to compensate for matrix effects and signal fluctuations, leading to unsatisfactory results.
Machine learning (ML) offers a promising approach to address these issues by establishing multivariate regression models that better correlate complex LIBS spectral data with qualitative and quantitative compositions. This review focuses on two main aspects: (1) data preprocessing for ML models, including spectral selection, variable reconstruction, and denoising; and (2) ML methods for improving quantification performance while reducing the impact of matrix effects and spectral fluctuations.
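The preprocessing-plus-regression pipeline summarized above can be illustrated with a minimal sketch. This is not a method from the review: the synthetic spectra, the total-area normalization step, the fixed spectral window, and the ordinary least-squares calibration are all illustrative assumptions standing in for the preprocessing (normalization, spectral selection) and multivariate regression techniques the review surveys.

```python
# Illustrative sketch (assumptions, not the review's method): synthetic LIBS-like
# spectra with one emission line on a flat continuum, corrupted by detector noise
# and shot-to-shot pulse-energy drift. Preprocessing = total-area normalization
# plus spectral (window) selection; calibration = multivariate least squares.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200
conc = rng.uniform(0.1, 5.0, n_samples)                    # analyte concentration

channel = np.arange(n_channels)
peak = np.exp(-0.5 * ((channel - 80) / 3.0) ** 2)          # one emission line
spectra = np.outer(conc, peak) + 1.0                       # line + continuum
spectra += 0.05 * rng.normal(size=spectra.shape)           # detector noise
spectra *= rng.uniform(0.8, 1.2, (n_samples, 1))           # pulse-energy drift

def normalize_total_area(x):
    """Divide each spectrum by its integrated intensity to damp
    multiplicative shot-to-shot fluctuation."""
    return x / x.sum(axis=1, keepdims=True)

Xn = normalize_total_area(spectra)
window = Xn[:, 70:91]                                      # spectral selection
A = np.hstack([window, np.ones((n_samples, 1))])           # add intercept column
coef, *_ = np.linalg.lstsq(A, conc, rcond=None)            # multivariate fit
rmse = np.sqrt(np.mean((A @ coef - conc) ** 2))
print(f"calibration RMSE: {rmse:.3f}")
```

In practice the review discusses far richer variants of each stage (variable reconstruction, denoising, nonlinear ML regressors); the point here is only the shape of the pipeline: normalize, select informative variables, then fit a multivariate model.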
The review highlights the progress of ML applications in LIBS and discusses challenges such as limited training data, the disconnect between physical principles and algorithms, low generalization ability, and the need for robust data processing. It also outlines future research directions and suggests that machine learning algorithms grounded in physical principles could be a key route to improved LIBS quantification, especially as large datasets become available.
Keywords: laser-induced breakdown spectroscopy, machine learning, repeatability, matrix effects, qualitative and quantitative analysis