https://github.com/mir-group/nequip
This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and only act on scalars, NequIP employs E(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials while exhibiting remarkable data efficiency.
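The defining property here is equivariance: rotating the input structure rotates the predicted forces by the same rotation, rather than leaving them unchanged. A minimal pure-Python sketch of that rotate-then-predict vs. predict-then-rotate check, using a toy pairwise force model in place of a real network (this is not NequIP code, just an illustration of the property NequIP builds in by construction):

```python
# Toy demonstration of rotation equivariance, the property NequIP builds
# into its E(3)-equivariant layers. A simple pairwise "model" is
# equivariant by construction: rotating the input positions rotates the
# predicted forces identically. (Illustrative only, not NequIP code.)
import math

def forces(positions):
    # Toy equivariant force model: F_i = sum_j (r_j - r_i).
    out = []
    for i, ri in enumerate(positions):
        f = [0.0, 0.0, 0.0]
        for j, rj in enumerate(positions):
            if i != j:
                for k in range(3):
                    f[k] += rj[k] - ri[k]
        out.append(f)
    return out

def rotate_z(points, theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c * x - s * y, s * x + c * y, z] for x, y, z in points]

pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.5]]
theta = 0.7

# Equivariance: predicting on rotated inputs equals rotating predictions.
f_rot_input = forces(rotate_z(pos, theta))
f_rot_output = rotate_z(forces(pos), theta)
assert all(abs(a - b) < 1e-9
           for fa, fb in zip(f_rot_input, f_rot_output)
           for a, b in zip(fa, fb))
```

Invariant models only guarantee this for scalar outputs; equivariant models like NequIP extend it to vector and tensor features throughout the network.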
Paper published: 04 May 2022
https://github.com/materialsvirtuallab/matgl?tab=readme-ov-file
This work presents the Crystal Hamiltonian Graph Neural Network (CHGNet), a graph neural network-based machine-learning interatomic potential (MLIP) that models the universal potential energy surface. CHGNet is pretrained on the energies, forces, stresses and magnetic moments from the Materials Project Trajectory Dataset, which consists of over 10 years of density functional theory calculations of more than 1.5 million inorganic structures. The explicit inclusion of magnetic moments enables CHGNet to learn and accurately represent the orbital occupancy of electrons, enhancing its capability to describe both atomic and electronic degrees of freedom.
Paper published: 14 September 2023
https://github.com/deepmodeling/deepmd-kit
DeePMD-kit is a package written in Python/C++, designed to minimize the effort required to build deep learning-based models of interatomic potential energy and force fields and to perform molecular dynamics (MD). This brings new hope to addressing the accuracy-versus-efficiency dilemma in molecular simulations. Applications of DeePMD-kit span from finite molecules to extended systems and from metallic to chemically bonded systems.
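The efficiency win comes from where the model sits in the MD loop: the integrator only needs energies and forces at each step, so the expensive DFT call is swapped for a cheap network evaluation. A minimal sketch of that loop (velocity Verlet in 1-D, with a harmonic force standing in for the DeePMD/MLIP force call; illustrative only):

```python
# Minimal sketch of the MD loop a learned potential plugs into: the
# integrator only consumes forces, so any force provider (DFT, DeePMD,
# or this 1-D harmonic stand-in) can be swapped in.

def force(x, k=1.0):
    return -k * x  # stand-in for a DeePMD/MLIP force evaluation

def velocity_verlet(x, v, dt, steps, m=1.0):
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / m) * dt * dt
        f_new = force(x)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
    return x, v

def energy(x, v, k=1.0, m=1.0):
    return 0.5 * k * x * x + 0.5 * m * v * v

# Velocity Verlet is symplectic: total energy stays near its start value.
e0 = energy(1.0, 0.0)
x1, v1 = velocity_verlet(1.0, 0.0, 0.01, 1000)
assert abs(energy(x1, v1) - e0) < 1e-4
```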
https://github.com/isayevlab/AIMNet2
This work presents the second generation of the atoms-in-molecules neural network potential (AIMNet2), which is applicable to species composed of up to 14 chemical elements in both neutral and charged states, making it a valuable method for modeling the majority of non-metallic compounds. Trained on an exhaustive dataset of 2 × 10⁷ hybrid-DFT quantum chemical calculations, AIMNet2 combines ML-parameterized short-range and physics-based long-range terms to attain generalizability that reaches from simple organics to diverse molecules with “exotic” element-organic bonding. The authors show that AIMNet2 outperforms semi-empirical GFN-xTB and is on par with reference density functional theory for interaction energy contributions, conformer search tasks, torsion rotation profiles, and molecular-to-macromolecular geometry optimization.
Works within ASE
New code, overhaul 6/10/24
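The short-range/long-range split described above can be sketched in a few lines: a learned term handles local bonding while a physics-based electrostatic term captures long-range interactions. Both pieces below are stand-ins (the real short-range term is a neural network, and the real long-range term uses AIMNet2's predicted partial charges), shown only to make the decomposition concrete:

```python
# Sketch of the AIMNet2-style energy decomposition: total energy =
# ML short-range term + physics-based long-range (electrostatic) term.
# Both terms here are simple stand-ins, not the actual AIMNet2 model.
import math

COULOMB_CONST = 14.399645  # eV * Angstrom per e^2, for illustration

def coulomb_energy(charges, positions):
    """Pairwise point-charge electrostatics (the physics-based term)."""
    e = 0.0
    n = len(charges)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            e += COULOMB_CONST * charges[i] * charges[j] / r
    return e

def short_range_energy(positions, cutoff=5.0):
    """Stand-in for the learned short-range term: a repulsive 1/r^6 wall."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            if r < cutoff:
                e += 1.0 / r**6
    return e

def total_energy(charges, positions):
    return short_range_energy(positions) + coulomb_energy(charges, positions)

# A +1/-1 ion pair 3 Angstroms apart: the attractive Coulomb term dominates.
e = total_energy([1.0, -1.0], [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)])
assert e < 0.0
```

The design point is that electrostatics has a known analytic form, so there is no reason to make the network learn it; the ML capacity is spent only on the hard short-range part.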
https://github.com/ACEsuit/mace
MACE provides fast and accurate machine learning interatomic potentials with higher order equivariant message passing.
MACE-MP: Materials Project Force Fields: They have collaborated with the Materials Project (MP) to train a universal MACE potential covering 89 elements on 1.6 M bulk crystals in the MPTrj dataset selected from MP relaxation trajectories. The models are released on GitHub at https://github.com/ACEsuit/mace-mp.
Very active community
Works within ASE
Modern tech stack with PyTorch and JAX
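All of these message-passing potentials, MACE included, operate on a radius graph: atoms are nodes, and an edge connects every pair of atoms closer than a cutoff distance. A minimal (non-periodic, O(N²)) pure-Python sketch of that graph construction, not taken from the MACE codebase:

```python
# Minimal radius-graph construction, the common first step of MACE-style
# message-passing potentials: atoms are nodes; edges link pairs within
# a cutoff. Non-periodic O(N^2) sketch for illustration only.
import math

def radius_graph(positions, cutoff):
    """Return directed edges (i, j) for all pairs with |r_i - r_j| < cutoff."""
    edges = []
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i != j and math.dist(positions[i], positions[j]) < cutoff:
                edges.append((i, j))
    return edges

# Three atoms on a line, 1 Angstrom apart; a 1.5 Angstrom cutoff links
# only nearest neighbors, in both directions.
pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
edges = radius_graph(pos, 1.5)
assert sorted(edges) == [(0, 1), (1, 0), (1, 2), (2, 1)]
```

Production codes replace the O(N²) double loop with cell lists and handle periodic boundary conditions, but the resulting edge list plays the same role.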
MatGL (Materials Graph Library) is a graph deep learning library for materials science. Mathematical graphs are a natural representation for a collection of atoms. Graph deep learning models have been shown to consistently deliver exceptional performance as surrogate models for the prediction of materials properties.
MatGL is built on the Deep Graph Library (DGL) and PyTorch, with suitable adaptations for materials-specific applications. The goal is for MatGL to serve as an extensible platform to develop and share materials graph deep learning models, including the MatErials 3-body Graph Network (M3GNet) and its predecessor, MEGNet.
Implements:
MEGNet: the MatErials Graph Network (MEGNet) is an implementation of DeepMind’s graph networks for machine learning in materials science, demonstrated to achieve low prediction errors across a broad array of properties in both molecules and crystals.
M3GNet: the Materials 3-body Graph Network (M3GNet) is a materials graph neural network architecture that incorporates 3-body interactions into MEGNet.
TensorNet, an O(3)-equivariant message-passing neural network architecture that leverages Cartesian tensor representations.
SO3Net, a minimalist SO(3)-equivariant neural network.
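The "3-body" part of M3GNet means messages that depend not just on pairwise distances but on bond angles formed by atom triplets. A minimal pure-Python sketch of extracting such an angle feature from coordinates (illustrative, not MatGL code):

```python
# Sketch of the 3-body geometric feature M3GNet adds on top of pairwise
# terms: the bond angle at a central atom, computed from coordinates.
# (Illustration only, not MatGL/M3GNet code.)
import math

def bond_angle(center, a, b):
    """Angle a-center-b in radians, from 3-D coordinates."""
    va = [a[k] - center[k] for k in range(3)]
    vb = [b[k] - center[k] for k in range(3)]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return math.acos(dot / (na * nb))

# Water-like geometry: O at the origin, two H atoms roughly 104 degrees apart.
o = (0.0, 0.0, 0.0)
h1 = (0.96, 0.0, 0.0)
h2 = (-0.24, 0.93, 0.0)
angle = math.degrees(bond_angle(o, h1, h2))
assert 100.0 < angle < 110.0
```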
This data file is the MPtrj (Materials Project Trajectory) dataset.
The JSON file contains 1,580,395 structures with 1,580,395 energies, 7,944,833 magnetic moments, 49,295,660 forces, and 14,223,555 stresses, which were used to train the pretrained CHGNet.
The structures and labels are parsed from all GGA/GGA+U static and relaxation trajectories in the 2022.9 release of the Materials Project, using a selection method that avoids incompatible calculations and duplicated structures.
Used to train CHGNet and MACE foundation models
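Each MPtrj frame carries the per-structure and per-atom labels listed above (energy, forces, magnetic moments, stress). A sketch of iterating over entries of that shape with the standard library; note the field names below are illustrative assumptions, so check the actual MPtrj JSON schema before relying on them, and a small in-memory mock stands in for the real multi-gigabyte file:

```python
# Sketch of tallying labels across MPtrj-style entries. Field names
# ("energy", "forces", "magmoms", "stress") are assumed for illustration;
# verify them against the real MPtrj JSON before use.
import json

mock_entry = {  # hypothetical structure of one labeled frame
    "energy": -10.5,                                  # eV per structure
    "forces": [[0.0, 0.1, -0.1], [0.0, -0.1, 0.1]],   # eV/Angstrom per atom
    "magmoms": [0.0, 2.1],                            # Bohr magnetons per atom
    "stress": [[0.0] * 3 for _ in range(3)],          # stress tensor
}

# Round-trip through json as a stand-in for json.load(open(path)).
data = json.loads(json.dumps({"frame-0": mock_entry}))

n_structures = len(data)
n_forces = sum(len(entry["forces"]) for entry in data.values())
n_magmoms = sum(len(entry["magmoms"]) for entry in data.values())
assert n_structures == 1
assert n_forces == 2 and n_magmoms == 2
```

On the real file, the same tallies should reproduce the counts quoted above (1,580,395 energies, 49,295,660 forces, and so on), modulo the actual schema.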