AtomSets – using graph networks as an encoder

[Figure: AtomSets]

Graph networks are an extremely powerful deep learning tool for predicting materials properties. However, a critical weakness is their reliance on large quantities of training data. In this work, published in npj Computational Materials, Dr Chi Chen shows that pre-trained MEGNet formation energy models can be used effectively as “encoders” for crystals in what we call the AtomSets framework. The compositional and structural descriptors extracted from the graph network deep learning models, combined with standard feed-forward neural networks, achieve lower errors than graph network models in the small-data limit and than non-deep-learning models in the large-data limit. AtomSets models also transfer better in a simulated materials discovery process where the targeted materials have property values outside the range of the training data, require minimal domain knowledge as input, and are free from feature engineering. Check out this work here.
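For readers curious what the transfer-learning step looks like in practice, below is a minimal sketch of the AtomSets idea: descriptors extracted from a frozen, pre-trained graph-network encoder are used as inputs to a standard feed-forward model. This is an illustration, not the paper's released code; the random arrays standing in for extracted MEGNet descriptors and property labels are placeholders.

```python
# Sketch of the AtomSets pattern: pre-trained graph-network descriptors
# feeding a small, standard neural network head. Placeholder data only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder for per-structure descriptors extracted from a frozen,
# pre-trained MEGNet formation-energy model (e.g., a readout-layer output).
X = rng.normal(size=(500, 96))
# Placeholder for the (small) labeled dataset of target property values.
y = rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A standard multi-layer perceptron plays the role of the AtomSets "head"
# trained on the fixed descriptors; no feature engineering is involved.
head = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
head.fit(X_train, y_train)
print("test R^2:", head.score(X_test, y_test))
```

Because the encoder stays frozen, only the small head is fit to the new property, which is what lets the framework work at small data sizes.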

Using ML to Discover New Materials

[Figure: BOWSR]

Our paper “Accelerating materials discovery with Bayesian optimization and graph deep learning” has just been published in Materials Today! In our group, we are firm advocates of ML models that use structure-based features, because only such models can reliably predict property differences between chemically similar but structurally different materials (e.g., diamond vs. graphite). A bottleneck, however, is that obtaining a relaxed input structure today still depends on expensive DFT calculations. Here, we show that Bayesian optimization coupled with an accurate MEGNet energy model can produce input structures that are good enough for ML property predictions. We demonstrate the power of this approach by screening 400,000 materials for ultra-incompressibility. Two completely novel materials were realized experimentally by Mingde Qin in Prof Jian Luo’s group at UCSD. This work paves the way to ML-accelerated discovery of new materials with exceptional properties. Check out this work here.
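As a rough illustration of the idea (not the released BOWSR implementation), the sketch below runs Bayesian optimization over a single lattice parameter against a stand-in energy function. In the real workflow, the objective would be a MEGNet energy evaluation of a symmetry-constrained candidate structure; the `predict_energy` function, the one-parameter search space, and the toy energy surface here are all assumptions for illustration.

```python
# Sketch of BOWSR-style structure optimization: Bayesian optimization of
# structural degrees of freedom against a surrogate (ML) energy model,
# in place of a DFT relaxation. Requires scikit-optimize.
from skopt import gp_minimize

def predict_energy(params):
    # Hypothetical stand-in for an ML energy evaluation of a structure
    # built from `params` (e.g., lattice constants and symmetry-constrained
    # atomic coordinates). Toy surface with a minimum at a = 3.57 Angstrom.
    a = params[0]
    return (a - 3.57) ** 2

result = gp_minimize(
    predict_energy,
    dimensions=[(3.0, 4.5)],  # bounds on the lattice parameter (Angstrom)
    n_calls=30,               # budget of surrogate-energy evaluations
    random_state=0,
)
print("relaxed lattice parameter:", result.x[0])
```

The key design point is that each step queries only the cheap ML energy model, so thousands of candidate structures can be "relaxed" at a tiny fraction of the cost of DFT.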