Abstract

A universal interatomic potential for arbitrary combinations of chemical elements is urgently needed in computational materials science. Graph convolutional neural networks (GCNs) have rich expressive power, but have previously been employed mainly to transport scalars and vectors, not tensors of rank ≥ 2. Because classic interatomic potentials were inspired by the tight-binding electronic-relaxation framework, we want to represent this iterative propagation of rank ≥ 2 tensor information with a GCN. Here we propose an architecture, the tensor embedded atom network (TeaNet), in which angular interactions are translated into graph convolution by incorporating Euclidean tensors, vectors, and scalars. By adopting a residual network (ResNet) architecture and initializing training with recurrent GCN weights, we constructed a much deeper (16-layer) GCN whose information flow resembles an iterative electronic relaxation. Our training dataset was generated by density functional theory calculations of mostly chemically and structurally randomized configurations. We demonstrate that TeaNet satisfactorily reproduces arbitrary structures and reactions involving the first 18 elements of the periodic table (H to Ar), including C–H molecular structures, metals, amorphous SiO2, and water, with good accuracy (energy mean absolute error of 19 meV/atom) and robustness.
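To illustrate the idea of a graph convolution that carries scalar, vector, and rank-2 tensor features with a residual update, the following is a minimal sketch only, not the authors' TeaNet implementation: the function names (`tea_like_conv`, `cutoff_fn`), the cutoff form, the channel count, and the particular way the ranks are mixed are all hypothetical choices made here for illustration.

```python
# Minimal sketch (not the authors' code) of a message-passing step that
# propagates rank-0/1/2 features per atom with a ResNet-style residual
# update, as described at a high level in the abstract.
import numpy as np

def cutoff_fn(r, rc=4.0):
    """Smooth radial cutoff: 1 at r=0, 0 at r>=rc (hypothetical form)."""
    x = np.clip(r / rc, 0.0, 1.0)
    return 0.5 * (np.cos(np.pi * x) + 1.0)

def tea_like_conv(pos, scalars, vectors, tensors, rc=4.0):
    """One residual graph-convolution step mixing scalars, vectors, tensors.

    pos:     (N, 3)       atomic positions
    scalars: (N, C)       rank-0 features
    vectors: (N, C, 3)    rank-1 features
    tensors: (N, C, 3, 3) rank-2 features
    """
    N, C = scalars.shape
    ds = np.zeros_like(scalars)
    dv = np.zeros_like(vectors)
    dT = np.zeros_like(tensors)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            rij = pos[j] - pos[i]
            r = np.linalg.norm(rij)
            if r >= rc:
                continue
            u = rij / r              # unit bond vector
            w = cutoff_fn(r, rc)     # scalar edge weight
            outer = np.outer(u, u)   # rank-2 bond geometry (dyad)
            # scalar message: neighbor scalar + projections of higher ranks
            ds[i] += w * (scalars[j]
                          + vectors[j] @ u
                          + np.einsum('cab,ab->c', tensors[j], outer))
            # vector message: neighbor vector + scalar sent along the bond
            dv[i] += w * (vectors[j] + scalars[j][:, None] * u)
            # tensor message: neighbor tensor + scalar times bond dyad
            dT[i] += w * (tensors[j] + scalars[j][:, None, None] * outer)
    # residual (ResNet-style) update
    return scalars + ds, vectors + dv, tensors + dT

# Usage: stack a few layers on random atoms with a hypothetical C=4 channels
rng = np.random.default_rng(0)
N, C = 5, 4
pos = rng.uniform(0.0, 3.0, size=(N, 3))
s = rng.normal(size=(N, C))
v = np.zeros((N, C, 3))
T = np.zeros((N, C, 3, 3))
for _ in range(3):
    s, v, T = tea_like_conv(pos, s, v, T)
print(s.shape, v.shape, T.shape)  # (5, 4) (5, 4, 3) (5, 4, 3, 3)
```

Stacking such layers deeply, with residual connections, is what allows the information flow to mimic an iterative relaxation; the paper's actual feature construction and interaction terms are more elaborate than this sketch.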
