2022 | Simon Batzner, Albert Musaelian, Lixin Sun, Mario Geiger, Jonathan P. Mailoa, Mordechai Kornbluth, Nicola Molinari, Tess E. Smidt & Boris Kozinsky
This work introduces NequIP, an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. Unlike traditional symmetry-aware models that use invariant convolutions on scalars, NequIP employs E(3)-equivariant convolutions on geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. NequIP achieves state-of-the-art accuracy on a diverse set of molecules and materials while demonstrating remarkable data efficiency, outperforming existing models with up to three orders of magnitude fewer training data. This high data efficiency allows for the construction of accurate potentials using high-order quantum chemical reference data and enables high-fidelity molecular dynamics simulations over long time scales.
Molecular dynamics simulations are essential for computational discovery in various fields. While quantum-mechanical calculations like DFT can provide accurate atomic forces, their computational cost limits simulations to short time scales and small systems. Classical models, though faster, are limited in predictive accuracy. Machine learning-based interatomic potentials, particularly neural networks, offer a promising solution by learning high-fidelity potentials from ab-initio reference calculations while maintaining computational efficiency. Unlike classical force fields, machine learning potentials treat all interactions identically based on relative atomic positions and species, without relying on explicit bonded or non-bonded terms.
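As background for how a potential drives molecular dynamics: forces follow from any potential-energy surface as F = -∇E. The sketch below is not the paper's model; `lj_energy` is a hypothetical toy Lennard-Jones potential standing in for a learned one, with forces estimated by finite differences.

```python
import numpy as np

def lj_energy(positions, eps=1.0, sigma=1.0):
    """Total energy of a toy Lennard-Jones pair potential
    (a stand-in for any learned potential-energy surface)."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            sr6 = (sigma / r) ** 6
            e += 4 * eps * (sr6**2 - sr6)
    return e

def forces(positions, h=1e-5):
    """Forces as the negative gradient of the energy, F = -dE/dr,
    estimated here by central finite differences."""
    f = np.zeros_like(positions)
    for idx in np.ndindex(positions.shape):
        p = positions.copy(); p[idx] += h
        m = positions.copy(); m[idx] -= h
        f[idx] = -(lj_energy(p) - lj_energy(m)) / (2 * h)
    return f

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
f = forces(pos)
# Newton's third law: forces on the pair are equal and opposite
assert np.allclose(f[0], -f[1], atol=1e-6)
```

In practice, neural-network potentials like NequIP obtain forces analytically by differentiating the predicted energy with respect to atomic positions, rather than by finite differences.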
NequIP is a highly data-efficient deep learning approach for learning interatomic potentials from first-principles calculations. It outperforms existing methods on a wide range of systems, including small molecules, water in different phases, amorphous solids, reactions at solid/gas interfaces, and a lithium superionic conductor. NequIP exhibits exceptional data efficiency, enabling accurate potentials from as few as 100 reference ab-initio calculations, where other methods require orders of magnitude more. It also performs well on small molecular datasets, competing with kernel-based approaches.
NequIP uses E(3)-equivariant convolutions over geometric tensors, allowing for rotation-equivariant features and angular information. This architecture enables high data efficiency and accuracy, as demonstrated by state-of-the-art results on a training set of molecular data obtained at the quantum chemical coupled-cluster level of theory. The method is validated through simulations, showing that structural and kinetic properties computed from NequIP trajectories closely match those from ab-initio molecular dynamics (AIMD).
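To illustrate what rotation equivariance means here, the following minimal sketch (not NequIP's actual layer; `vector_messages` is a hypothetical toy) aggregates neighbour direction vectors with invariant radial weights, producing l=1 (vector) features that rotate exactly as the inputs do:

```python
import numpy as np

def vector_messages(positions, cutoff=2.5):
    """Toy equivariant 'convolution': each atom aggregates unit vectors
    to its neighbours, weighted by an invariant radial function of the
    distance. Output is one l=1 (vector) feature per atom."""
    n = len(positions)
    feats = np.zeros((n, 3))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rij = positions[j] - positions[i]
            d = np.linalg.norm(rij)
            if d < cutoff:
                w = np.exp(-d)           # scalar (invariant) radial weight
                feats[i] += w * rij / d  # direction carries the geometry
    return feats

rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 3))
# random proper rotation via QR decomposition
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1  # ensure det(Q) = +1

# Equivariance check: rotating the inputs rotates the outputs.
out_then_rotate = vector_messages(pos) @ Q.T
rotate_then_out = vector_messages(pos @ Q.T)
assert np.allclose(out_then_rotate, rotate_then_out, atol=1e-10)
```

Invariant models discard the directional part and keep only scalars such as distances; retaining tensor features like these is what makes the representation more information-rich.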
NequIP's architecture is built on an atomic embedding, followed by interaction blocks and an output block. The interaction blocks encode atom-atom interactions using equivariant convolutions, while the output block generates atomic energies. The model is trained on a diverse set of data, including the MD-17 molecular benchmark, coupled-cluster-level quantum chemical data, and extended systems with periodic boundary conditions. NequIP significantly outperforms invariant GNN-IPs, shallow neural networks, and kernel-based approaches on various benchmark datasets.
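The embedding / interaction-block / output-block pipeline above can be sketched as follows. This is a simplified invariant stand-in, not NequIP's equivariant implementation; the function names and the untrained random weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def embed(species, n_species=3, dim=8):
    """Atomic embedding: one feature vector per chemical species."""
    table = rng.normal(size=(n_species, dim))
    return table[species]

def interaction_block(feats, positions, cutoff=3.0):
    """Toy interaction block: update each atom's features with a
    distance-weighted sum over neighbour features (an invariant
    stand-in for NequIP's equivariant convolution)."""
    n, dim = feats.shape
    W = rng.normal(size=(dim, dim)) / np.sqrt(dim)
    out = feats.copy()
    for i in range(n):
        msg = np.zeros(dim)
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(positions[j] - positions[i])
            if d < cutoff:
                msg += np.exp(-d) * feats[j]
        out[i] = np.tanh(feats[i] + msg @ W)  # residual-style update
    return out

def output_block(feats):
    """Map final features to one scalar energy per atom."""
    w = rng.normal(size=feats.shape[1])
    return feats @ w

# Forward pass: embedding -> stacked interaction blocks -> atomic energies
species = np.array([0, 1, 1, 2])
positions = rng.normal(size=(4, 3))
h = embed(species)
for _ in range(3):                    # three stacked interaction blocks
    h = interaction_block(h, positions)
atomic_energies = output_block(h)
total_energy = atomic_energies.sum()  # total energy = sum of atomic energies
```

Summing per-atom energies into a total energy is the standard construction for such potentials; it keeps the model size-extensive and lets forces be obtained by differentiating the total energy.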
NequIP demonstrates exceptional data efficiency, achieving high accuracy with significantly fewer training data than competing methods across these benchmarks.