Abstract

We apply on-the-fly machine learning potentials (MLPs) based on the sparse Gaussian process regression (SGPR) algorithm to the fast optimization of atomic structures. Substantial acceleration is achieved even in the context of a single local optimization, although, owing to the limited accuracy of MLPs, switching to another algorithm may be needed to locate the exact local minimum. For random gold clusters, the forces are reduced to ∼0.1 eV Å⁻¹ within fewer than ten first-principles (FP) calculations. Because the MLPs are highly transferable, the algorithm is especially well suited to global optimization methods such as random or evolutionary structure searching and basin hopping. This is demonstrated by the sequential optimization of random gold clusters, for which, after only a few optimizations, FP calculations were rarely needed.
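To make the workflow concrete, the sketch below illustrates the on-the-fly idea described above: each first-principles (FP) call adds one energy/gradient observation to a surrogate, the structure is then relaxed on the cheap surrogate, and the next FP call only verifies the result. This is not the authors' SGPR code: a one-dimensional Lennard-Jones dimer stands in for the FP calculator, an ordinary (non-sparse) Gaussian process trained on energies and gradients stands in for the SGPR model, and the length scale, noise, bounds, and force threshold are illustrative choices rather than values from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def fp_energy_gradient(r):
    """'Expensive' reference: Lennard-Jones dimer energy and gradient dE/dr."""
    e = 4.0 * (r ** -12 - r ** -6)
    g = 4.0 * (-12.0 * r ** -13 + 6.0 * r ** -7)
    return e, g

def fit_surrogate(X, Y, G, l=0.3, noise=1e-6):
    """Plain GP on energies Y and gradients G (RBF kernel, 1D input)."""
    d = X[:, None] - X[None, :]
    k = np.exp(-0.5 * (d / l) ** 2)
    # Joint covariance over (energy, gradient) observations.
    K = np.block([[k,                 d / l ** 2 * k],
                  [-d / l ** 2 * k,  (1.0 / l ** 2 - d ** 2 / l ** 4) * k]])
    alpha = np.linalg.solve(K + noise * np.eye(2 * len(X)),
                            np.concatenate([Y, G]))

    def energy(x):
        dx = x - X                            # x is a length-1 array
        kx = np.exp(-0.5 * (dx / l) ** 2)
        return float(np.concatenate([kx, dx / l ** 2 * kx]) @ alpha)

    return energy

r = 1.3                                       # starting bond length
X, Y, G = [], [], []
for fp_call in range(1, 11):                  # budget of ten FP calls
    e, g = fp_energy_gradient(r)              # the only expensive step
    X.append(r); Y.append(e); G.append(g)
    print(f"FP call {fp_call}: r = {r:.4f}, |force| = {abs(g):.3f}")
    if abs(g) < 0.1:                          # cf. the ~0.1 eV/Å figure above;
        break                                 # beyond this, hand over to an FP optimizer
    surrogate = fit_surrogate(np.array(X), np.array(Y), np.array(G))
    # Relax on the surrogate; the next FP call only verifies the result.
    r = float(minimize(surrogate, x0=r, bounds=[(0.95, 1.8)]).x[0])
```

In the setting of the abstract, the same loop would wrap a first-principles calculator and a sparse GP over atomic environments, with the surrogate reused across successive cluster optimizations so that later relaxations need few or no new FP calls.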
