Abstract

We present a computationally efficient method to model interatomic interactions, namely energies and forces. The proposed model approximates the energy/forces using a linear combination of random features, thereby enabling fast parameter estimation by solving a linear least-squares problem. We discuss how random features based on stationary and non-stationary kernels can be used for energy approximation and provide results for three classes of materials, namely two-dimensional materials, metals and semiconductors. Force and energy predictions made using the proposed method are in close agreement with density functional theory calculations, with a training time that is 96% lower than that of standard kernel models. Molecular dynamics calculations using random features-based interatomic potentials agree well with experimental and density functional theory values, and phonon frequencies computed with random features-based interatomic potentials are within 0.1% of the density functional theory results. Furthermore, the proposed random features-based potential addresses the scalability issues encountered in this class of machine learning problems.
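As a rough, hypothetical illustration of this idea (not the authors' implementation), the sketch below fits an energy model that is linear in random Fourier features with a single least-squares solve. The descriptor vectors, the feature count M and the Gaussian choice of p(ω) are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, d, M = 200, 8, 300            # illustrative sizes, not taken from the paper

Q = rng.normal(size=(n_train, d))      # placeholder descriptor vectors
E = rng.normal(size=n_train)           # placeholder DFT reference energies

# Random Fourier features: frequencies are drawn once from p(w) and then frozen,
# so the model E(q) ~ w^T z(q) is linear in the unknown weights w.
omega = rng.normal(size=(d, M))        # Gaussian p(w), i.e. a stationary (RBF-type) kernel
Z = np.hstack([np.cos(Q @ omega), np.sin(Q @ omega)]) / np.sqrt(M)

# Training reduces to one linear least-squares solve, which is the source of the
# large speed-up over fitting a full kernel model on the same data.
w, *_ = np.linalg.lstsq(Z, E, rcond=None)
E_pred = Z @ w
```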

Highlights

  • Classical molecular dynamics (MD) based on empirical force fields has emerged as a powerful technique for predicting material properties and behaviour at the atomistic scale

  • This is because the use of empirical models of interatomic interactions enables significantly faster and more scalable computations compared to ab initio methods such as density functional theory (DFT)[1]

  • In order to address the high computational cost associated with the application of machine learning (ML) to learn interatomic potentials (IPs), this study presents a method that approximates the local atomic energy as a linear combination of random features associated with a kernel (sketched in code below)
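A minimal sketch of that per-atom decomposition, under assumed descriptor sizes and a Gaussian p(ω): the total energy of a structure is the sum of local atomic energies, each linear in the same weight vector, so every structure contributes one summed-feature row to an ordinary least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(2)
d, M = 8, 500                               # assumed local-descriptor size and feature count
omega = rng.normal(size=(d, M))             # frozen random frequencies

def z(Q_atoms):
    """Feature map applied to an (n_atoms, d) array of local descriptors."""
    proj = Q_atoms @ omega
    return np.hstack([np.cos(proj), np.sin(proj)]) / np.sqrt(M)

# Total energy of a structure = sum of local atomic energies, each linear in w:
#   E = sum_i w . z(q_i) = w . (sum_i z(q_i))
# so each structure contributes ONE row, sum_i z(q_i), to the design matrix,
# and the global fit remains an ordinary linear least-squares problem.
structures = [rng.normal(size=(rng.integers(4, 12), d)) for _ in range(50)]
E_ref = rng.normal(size=len(structures))    # placeholder reference energies

A = np.vstack([z(Q).sum(axis=0) for Q in structures])
w, *_ = np.linalg.lstsq(A, E_ref, rcond=None)
```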


Summary

INTRODUCTION

Classical molecular dynamics (MD) based on empirical force fields has emerged as a powerful technique for predicting material properties and behaviour at the atomistic scale. In order to address the high computational cost associated with the application of ML to learn IPs, this study presents a method that approximates the local atomic energy as a linear combination of random features associated with a kernel. Following Rahimi and Recht [39], the kernel can be approximated in the form of Eq. 2, with

z(q) = \sqrt{2/M}\,\left[\cos(\omega_1^T q),\ \sin(\omega_1^T q),\ \cdots,\ \cos(\omega_M^T q),\ \sin(\omega_M^T q)\right] \in \mathbb{R}^{2M},

where the distribution p(\omega) takes the specific forms detailed by Rahimi and Recht [39]. Apart from providing better energy/force fittings, IPs based on random features significantly reduce the training time, by 96% as compared to GAP. The main reason for this is the smaller number of parameters needed to fit the energy function of random features-based IPs. The low-dimensional parameter space also has a direct impact on the computational cost of energy/force evaluations during MD run time, since fewer terms need to be evaluated at each step.
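The construction in Eq. 2 can be checked numerically. The sketch below assumes a stationary Gaussian kernel, for which p(ω) is itself Gaussian (one of the forms tabulated by Rahimi and Recht); the descriptor dimension, kernel width and M are illustrative, and the 1/√M normalisation is chosen so that z(q)·z(q′) is an unbiased Monte Carlo estimate of the kernel (a different constant only rescales the weights fitted on top of the features).

```python
import numpy as np

rng = np.random.default_rng(1)
d, M, sigma = 8, 2000, 1.0           # assumed descriptor size, feature count, kernel width

def z(q, omega):
    """Random feature map of Eq. 2: cos/sin pairs of random projections, in R^(2M)."""
    proj = q @ omega                  # M random projections omega_i^T q
    return np.concatenate([np.cos(proj), np.sin(proj)]) / np.sqrt(M)

# Bochner's theorem: for k(q, q') = exp(-||q - q'||^2 / (2 sigma^2)),
# the spectral density is p(w) = N(0, I / sigma^2).
omega = rng.normal(scale=1.0 / sigma, size=(d, M))

q1 = rng.normal(size=d)
q2 = q1 + 0.3 * rng.normal(size=d)    # a nearby descriptor, so the kernel value is not tiny

approx = z(q1, omega) @ z(q2, omega)
exact = np.exp(-np.sum((q1 - q2) ** 2) / (2 * sigma ** 2))
print(f"random-feature estimate {approx:.4f} vs exact kernel {exact:.4f}")
```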

RESULTS
DISCUSSION
CODE AVAILABILITY
