Abstract

The universal mathematical form of machine-learning potentials (MLPs) shifts the core of interatomic-potential development to collecting proper training data. Ideally, the training set should encompass diverse local atomic environments, but conventional approaches are prone to sampling similar configurations repeatedly, mainly due to Boltzmann statistics. As such, practitioners handpick a large pool of distinct configurations manually, stretching the development period significantly. To overcome this hurdle, methods that automatically generate training data are being proposed. Herein, we suggest a sampling method optimized for gathering diverse yet relevant configurations semi-automatically. This is achieved by applying metadynamics with a descriptor of the local atomic environment as a collective variable. As a result, the simulation is automatically steered toward unvisited regions of the local-environment space such that each atom experiences diverse chemical environments without redundancy. We apply the proposed metadynamics sampling to H:Pt(111), GeTe, and Si systems. Throughout these examples, a small number of metadynamics trajectories can provide the reference structures necessary for training high-fidelity MLPs. By proposing a semi-automatic sampling method tuned for MLPs, the present work paves the way for applying MLPs to many challenging problems.
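The core mechanism described above can be sketched in a toy one-dimensional setting: metadynamics deposits repulsive Gaussians at previously visited values of the collective variable (CV), so the accumulated bias pushes the system toward unvisited CV regions. This is a minimal illustrative sketch with a scalar CV and made-up parameters (the `metadynamics_bias` helper, the Gaussian height/width, and the Monte Carlo dynamics are all assumptions for illustration), not the paper's actual descriptor-based CV or MD implementation:

```python
import numpy as np

def metadynamics_bias(cv_history, s, height=0.1, width=0.2):
    """Bias potential at CV value s: sum of Gaussians deposited at past CV values."""
    s_dep = np.asarray(cv_history, dtype=float)
    return float(np.sum(height * np.exp(-(s - s_dep) ** 2 / (2.0 * width ** 2))))

# Toy biased random walk on a 1D collective variable. With a flat physical
# potential, the growing bias alone drives the walker into new CV regions.
rng = np.random.default_rng(0)
s, history = 0.0, []
for step in range(2000):
    history.append(s)  # deposit a Gaussian at the current CV value
    trial = s + rng.normal(scale=0.05)
    # Metropolis acceptance on the bias-energy difference (temperature 0.1)
    dV = metadynamics_bias(history, trial) - metadynamics_bias(history, s)
    if dV < 0 or rng.random() < np.exp(-dV / 0.1):
        s = trial
```

In the paper's setting the scalar `s` would be replaced by a descriptor of each atom's local environment, so the same bias-accumulation logic discourages revisiting already-sampled atomic environments.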

Highlights

  • By delivering the accuracy of density-functional theory (DFT) calculations at much lower costs, atomistic simulations based on machine-learning potentials (MLPs) are being established as a new pillar in computational materials science[1].

  • Most MLPs utilize the locality of quantum systems, so the computational cost increases linearly with system size, a significant advantage over DFT with its cubic scaling[2].

Introduction

By delivering the accuracy of density-functional theory (DFT) calculations at much lower costs, atomistic simulations based on machine-learning potentials (MLPs) are being established as a new pillar in computational materials science[1]. Most MLPs utilize the locality of quantum systems, so the computational cost increases linearly with system size, a significant advantage over DFT with its cubic scaling[2]. Various types of MLPs have been proposed: the neural network potential (NNP)[3], the Gaussian approximation potential (GAP)[4], the moment tensor potential[5], the deep tensor neural network[6], and gradient-domain machine learning[7]. The universal mathematical structure of MLPs shifts the core of potential development to collecting a proper training set, which defines the atomic environments within which the trained MLP is valid. The training set is typically selected from crystal-derived structures and their molecular dynamics (MD) trajectories.
