Abstract

Recent advances in machine-learning interatomic potentials have enabled the efficient modeling of complex atomistic systems with an accuracy that is comparable to that of conventional quantum-mechanics-based methods. At the same time, the construction of new machine-learning potentials can seem a daunting task, as it involves data-science techniques that are not yet common in chemistry and materials science. Here, we provide a tutorial-style overview of strategies and best practices for the construction of artificial neural network (ANN) potentials. We illustrate the most important aspects of (a) data collection, (b) model selection, (c) training and validation, and (d) testing and refinement of ANN potentials on the basis of practical examples. Current research in the areas of active learning and delta learning is also discussed in the context of ANN potentials. This tutorial review aims at equipping computational chemists and materials scientists with the required background knowledge for ANN potential construction and application, with the intention of accelerating the adoption of the method, so that it can facilitate exciting research that would otherwise be challenging with conventional strategies.

Highlights

  • Recent advances in machine-learning interatomic potentials have enabled the efficient modeling of complex atomistic systems with an accuracy that is comparable to that of conventional quantum-mechanics-based methods

  • This tutorial review aims at equipping computational chemists and materials scientists with the required background knowledge for artificial neural network (ANN) potential construction and application, with the intention to accelerate the adoption of the method, so that it can facilitate exciting research that would otherwise be challenging with conventional strategies

  • Owing to the success of the early ANN-based Behler-Parrinello machine-learning potential (MLP) [23, 31] and the Gaussian process regression (GPR)-based Gaussian approximation potential (GAP) [24, 32], the number of MLP methods proposed in the literature has been growing rapidly. Examples include MLPs based on GPR and other kernel-based methods, such as kernel ridge regression [33,34,35,36], moment tensor potentials [37, 38], graph networks using message passing [39,40,41,42,43,44,45,46,47], spectral neighbor analysis potentials (SNAP) [48, 49], and other ANN-based approaches [50,51,52,53]

Introduction

Recent advances in machine-learning interatomic potentials have enabled the efficient modeling of complex atomistic systems with an accuracy that is comparable to that of conventional quantum-mechanics-based methods. Owing to the success of the early ANN-based Behler-Parrinello MLPs [23, 31] and the GPR-based Gaussian approximation potential (GAP) [24, 32], the number of MLP methods proposed in the literature has been growing rapidly. Examples include MLPs based on GPR and other kernel-based methods, such as kernel ridge regression [33,34,35,36], moment tensor potentials [37, 38], graph networks using message passing [39,40,41,42,43,44,45,46,47], spectral neighbor analysis potentials (SNAP) [48, 49], and other ANN-based approaches [50,51,52,53]. We emphasize that this list is not exhaustive and does not include the various ML methods for atomistic modeling that cannot be considered interatomic potentials [54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70]. The sections on model selection (section III) and training/validation (section IV) are mostly specific to ANN potentials.
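To make the Behler-Parrinello idea concrete, the sketch below illustrates its central ansatz: the total energy is decomposed into atomic contributions, each predicted by a small neural network from rotation- and permutation-invariant descriptors of the local environment (here, simple radial symmetry functions with a smooth cosine cutoff). This is a minimal illustrative toy with untrained random weights, not the implementation used in any of the cited works; the class and function names, the choice of `eta` values, and the cutoff radius are our own assumptions for the example.

```python
import numpy as np

def cosine_cutoff(r, rc=6.0):
    # Smooth cutoff function that decays to zero at r = rc, so that
    # descriptors depend only on the local atomic environment.
    return np.where(r < rc, 0.5 * (np.cos(np.pi * np.minimum(r, rc) / rc) + 1.0), 0.0)

def radial_symmetry_functions(positions, etas=(0.5, 1.0, 2.0), rc=6.0):
    """Per-atom radial descriptors: G_i(eta) = sum_{j != i} exp(-eta r_ij^2) fc(r_ij).

    These are invariant to translation, rotation, and permutation of
    identical atoms; shape of the result is (n_atoms, n_etas).
    """
    r = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    np.fill_diagonal(r, np.inf)  # exclude the i == j self term
    fc = cosine_cutoff(r, rc)
    return np.stack([(np.exp(-eta * r**2) * fc).sum(axis=1) for eta in etas], axis=1)

class TinyAtomicNN:
    """A small feed-forward network mapping a descriptor vector to an atomic energy.

    Weights are random here; in practice they are fitted to reference
    energies and forces from quantum-mechanical calculations.
    """
    def __init__(self, n_in, n_hidden=5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def atomic_energies(self, G):
        h = np.tanh(G @ self.W1 + self.b1)
        return (h @ self.W2 + self.b2).ravel()

def total_energy(positions, model):
    # Behler-Parrinello decomposition: E_total = sum_i E_i(G_i)
    G = radial_symmetry_functions(positions)
    return model.atomic_energies(G).sum()
```

Because the descriptors are permutation-invariant and the energy is a sum over atoms, relabeling identical atoms leaves the predicted total energy unchanged, which is one of the key physical constraints these architectures build in by construction.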

Methods
Discussion
Conclusion