Transferability and Accuracy of Ionic Liquid Simulations with Equivariant Machine Learning Interatomic Potentials

15 Jul 2024 | Zachary A. H. Goodwin, Malia B. Wenny, Julia H. Yang, Andrea Cepellotti, Jingxuan Ding, Kyle Bystrom, Blake R. Duschatko, Anders Johansson, Lixin Sun, Simon Batzner, Albert Musaelian, Jarad A. Mason, Boris Kozinsky, and Nicola Molinari
This study investigates the transferability and accuracy of machine learning interatomic potentials (MLIPs) for simulating ionic liquids (ILs). It demonstrates that MLIPs can be trained to be compositionally transferable: trained on only a few compositions, they accurately predict the properties of mixtures they were never trained on. The study also evaluates the accuracy of an MLIP for a novel IL, [F-OMIM]⁺[C₄F₉CO₂]⁻, which was synthesized and characterized experimentally; a model trained on approximately 200 DFT frames shows reasonable agreement with both experimental and DFT results.

The work highlights the importance of training MLIPs on "high-entropy" compositions, such as Li-salt molar fractions of 0.4-0.6, so that all relevant interactions are well sampled; this leads to better transferability and accuracy. It also explores how the choice of training compositions and the data-generation method affect model performance, finding that training on multiple compositions, especially high-entropy ones, significantly improves the model's ability to generalize to unseen compositions.

The study compares two equivariant MLIP architectures, NequIP and Allegro, for simulating ILs. Allegro, a strictly local architecture derived from NequIP, proves more efficient and effective for large systems, and it achieves accurate predictions from a comparatively small dataset, making it well suited to simulating complex mixtures of ILs and solutes.
The research also addresses the challenge of energy generalization in MLIPs, showing that energy predictions are more sensitive to the number of compositions used for training. It provides guidelines for training MLIPs on complex mixtures, emphasizing the need to sample compositions that thoroughly cover all interactions. These findings support the development of more accurate and transferable MLIPs for simulating ILs and their mixtures, which are important for applications in energy storage, solvents, and chemical reactions.
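To make the composition-sampling guideline above concrete, the sketch below assembles a small DFT training set that emphasizes "high-entropy" Li-salt molar fractions (0.4-0.6) alongside the pure end members. This is a minimal illustration only: the helper functions (`build_mixture`, `sample_dft_frames`), the frame counts, and the exact composition list are hypothetical placeholders, not the authors' actual workflow and not the NequIP/Allegro API.

```python
# Hypothetical sketch: assembling a compositionally diverse DFT training set
# for an ionic-liquid / Li-salt mixture MLIP. All helpers are placeholders.
import random


def build_mixture(x_li: float, n_ion_pairs: int = 32) -> dict:
    """Placeholder: construct a simulation cell with Li-salt molar fraction x_li."""
    return {"x_li": x_li, "n_ion_pairs": n_ion_pairs}


def sample_dft_frames(structure: dict, n_frames: int) -> list:
    """Placeholder: run short AIMD / single-point DFT and return labeled frames."""
    return [{"structure": structure, "frame_id": i} for i in range(n_frames)]


# Compositions: the pure IL, the salt-rich limit, and "high-entropy" mid-range
# fractions (0.4-0.6) where cation, anion, and Li interactions are all sampled.
compositions = [0.0, 0.4, 0.5, 0.6, 1.0]
frames_per_composition = 40  # ~200 frames total, a budget similar to that quoted above

training_set = []
for x in compositions:
    cell = build_mixture(x)
    training_set.extend(sample_dft_frames(cell, frames_per_composition))

random.shuffle(training_set)           # avoid composition-ordered batches
n_val = int(0.1 * len(training_set))   # hold out ~10% for validation
validation_set, training_set = training_set[:n_val], training_set[n_val:]

print(f"{len(training_set)} training frames, {len(validation_set)} validation frames")
```

The design point the study makes is that covering the mid-range compositions matters more than exhaustively enumerating every mixture: once the mixed-interaction environments are represented, the model can interpolate to compositions it has not seen.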