Abstract

Efficient prediction of sampling-intensive thermodynamic properties is needed to evaluate material performance and permit high-throughput materials modeling for a diverse array of technology applications. High-throughput configurational sampling with density functional theory (DFT) is prohibitively expensive; surrogate modeling strategies such as cluster expansion are many orders of magnitude more efficient but can be difficult to construct in systems with high compositional complexity. We therefore employ minimal-complexity graph neural network models that accurately predict formation energies of DFT-relaxed structures from an ideal (unrelaxed) crystallographic representation, and that can even extrapolate to formation energies outside the training distribution. This enables the large-scale sampling necessary for thermodynamic property predictions that may otherwise be intractable, and it can be achieved with small training data sets. Two exemplars, optimizing the thermodynamic stability of low-density high-entropy alloys and modulating the plateau pressure of hydrogen in metal alloys, demonstrate the power of this approach, which can be extended to a variety of materials discovery and modeling problems.
