Abstract

The subject of this paper is the technology (the ‘how’) of constructing machine-learning interatomic potentials, rather than the science (the ‘what’ and ‘why’) of atomistic simulations using machine-learning potentials. Namely, we illustrate how to construct moment tensor potentials using active learning as implemented in the MLIP package, focusing on efficient ways to automatically sample configurations for the training set, how expanding the training set changes the prediction error, how to set up ab initio calculations in a cost-effective manner, etc. The MLIP package (short for Machine-Learning Interatomic Potentials) is available at https://mlip.skoltech.ru/download/.

Highlights

  • Machine-learning interatomic potentials have recently been a subject of research and are now turning into a tool of research

  • In this manuscript we focus on the methodology of applying moment tensor potentials (MTP) and active learning to atomistic simulations

  • We introduce the notion of extrapolation grade γ(cfg)—a feature of a configuration that correlates with the prediction error but does not require ab initio information to be calculated prior to its evaluation; its precise definition is given below
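The extrapolation grade can be illustrated with a small numerical sketch. In the MaxVol-based active-learning scheme used with MTPs, a new configuration's descriptor vector is expressed as a linear combination of the descriptor vectors of the configurations in the active set, and γ is the largest coefficient magnitude; γ ≤ 1 indicates interpolation, γ > 1 extrapolation. The function below is an illustrative simplification (the names and the reduction of a configuration to a single descriptor vector are assumptions for the example, not the MLIP package's API):

```python
import numpy as np

def extrapolation_grade(b, A):
    """Illustrative MaxVol-style extrapolation grade.

    b : descriptor vector of the new configuration (length m)
    A : m x m matrix whose rows are the descriptor vectors of the
        m active-set configurations

    Expresses b as a linear combination of the rows of A and returns
    the largest coefficient magnitude.  Values <= 1 mean the new
    configuration lies within the 'span' of the training set;
    values > 1 signal extrapolation.
    """
    c = np.linalg.solve(A.T, b)  # solve c_1*A[0,:] + ... + c_m*A[m-1,:] = b
    return float(np.max(np.abs(c)))

# Toy example with an orthonormal active set:
A = np.eye(2)
print(extrapolation_grade(np.array([0.3, 0.5]), A))   # 0.5 -> interpolation
print(extrapolation_grade(np.array([2.0, -0.1]), A))  # 2.0 -> extrapolation
```

In practice the active set is chosen by the MaxVol algorithm so that the matrix A has maximal volume (determinant), which keeps the coefficients, and hence γ, well conditioned.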


Introduction

Machine-learning interatomic potentials have recently been a subject of research and are now turning into a tool of research. The concept of machine-learning potentials was pioneered in 2007 by Behler and Parrinello [5], who, motivated by the success of approximating potential energy surfaces of small molecules with neural networks, proposed the use of neural networks as a functional form of interatomic potentials, going beyond small molecular systems by exploiting the locality of interaction. MTPs, like SOAP-GAP, are not based solely on two- and three-body descriptors and can provably approximate an arbitrary local interaction, as are ACE [18] and PIP [75]. Because of this, MTP together with GAP showed excellent accuracy in a recent cheminformatics benchmark test [54] and an interatomic potential test [82]; in the latter, MTP showed a very good balance between accuracy and computational efficiency when compared against other machine-learning potentials. The description of and references to the MLIP code are kept to a minimum, yet retained for the purpose of easier reproduction of the described results.
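The active-learning bootstrapping workflow discussed in this paper can be sketched at a high level as a loop: run a simulation with the current potential, flag configurations whose extrapolation grade exceeds a threshold, label them with ab initio calculations, retrain, and repeat until the simulation no longer leaves the training domain. The helper names (`run_md`, `compute_dft`, the `potential` interface) and the threshold value below are assumptions for illustration, not the actual MLIP package interface:

```python
GAMMA_SELECT = 2.0  # assumed threshold above which a configuration is flagged

def active_learning_loop(potential, train_set, run_md, compute_dft, max_iters=10):
    """Illustrative active-learning bootstrapping loop.

    potential   : object with extrapolation_grade(cfg) and retrain(train_set)
    train_set   : list of labeled configurations
    run_md      : callable running MD with the potential, yielding configurations
    compute_dft : callable labeling a configuration with ab initio data
    """
    for _ in range(max_iters):
        # Collect configurations on which the potential extrapolates.
        flagged = [cfg for cfg in run_md(potential)
                   if potential.extrapolation_grade(cfg) > GAMMA_SELECT]
        if not flagged:
            break  # MD stayed within the training domain; we are done
        # Label the flagged configurations with quantum-mechanical data
        # and retrain the potential on the expanded training set.
        train_set = train_set + [compute_dft(cfg) for cfg in flagged]
        potential = potential.retrain(train_set)
    return potential
```

The key cost saving is that ab initio calculations are performed only on the automatically selected extrapolative configurations, rather than on every configuration visited by the simulation.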

Moment Tensor Potential
Training on a quantum-mechanical database
Active learning: query strategy
Active learning bootstrapping iterations
Result
MLIP Package
Example 1
Training and Validation of MTP
Example 2
Active learning iterations as implemented in MLIP
Stage 1
Example 3
Findings
Concluding Remarks