MatGL
Graph deep learning is transforming materials science by enabling accurate, scalable, and efficient predictions of material properties and potential energy surfaces (PES). Our article on “Materials Graph Library (MatGL) — an open-source, modular framework purpose-built for materials science and chemistry” has been published in npj Computational Materials. MatGL started as a collaboration between Intel Labs and the Materials Virtual Lab to provide a “batteries-included” environment for:
- Implementing both invariant and equivariant graph neural networks (GNNs), including M3GNet, MEGNet, CHGNet, TensorNet, and SO3Net.
- Leveraging pretrained foundation potentials (FPs) covering the full periodic table for out-of-the-box property predictions and atomistic simulations.
- Integrating seamlessly with ASE and LAMMPS for high-fidelity molecular dynamics, geometry optimization, and property calculations.
- Training custom models with PyTorch Lightning for efficient parallelization on CPUs, GPUs, and TPUs.
We show that MatGL’s models achieve state-of-the-art accuracy on widely used datasets (QM9, Matbench, ANI-1x, MPF, MatPES) while maintaining competitive computational efficiency. The library’s design also supports fine-tuning, enabling rapid adaptation to new materials systems.
Read the full paper here and explore MatGL on GitHub.