Global optimization with first-principles energy expressions (GOFEE) is an efficient method for identifying low-energy structures in computationally expensive energy landscapes, such as those described by density functional theory (DFT), van der Waals-enabled DFT, or even methods beyond DFT. GOFEE is an evolutionary algorithm that explores configuration space by creating several candidates in parallel. The candidates are treated approximately with a machine-learned surrogate model of energies and forces, trained on the fly, which eliminates the need for expensive relaxations at the first-principles level. Eventually, using Bayesian statistics, GOFEE selects one candidate and evaluates it at the full first-principles level. In this paper we elaborate on the importance of using a Gaussian kernel with two length scales in the Gaussian process regression surrogate model. We further explore the role played in GOFEE by the lower confidence bound in the relaxation and selection of candidate structures. In addition, we present details of a sampling scheme for obtaining parent structures in the evolution. By applying machine-learning clustering to the entire pool of low-energy structures ever calculated and choosing the most stable member of each cluster, the scheme ensures a highly diverse sample of structures that plays the role of a population. The versatility of the GOFEE method is demonstrated by applying it to identify the low-energy structures of gas-phase fullerene-type 24-atom carbon clusters and of dome-shaped 18-atom carbon clusters supported on Ir(111).
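The abstract names three algorithmic ingredients: a two-length-scale Gaussian kernel in the Gaussian process regression surrogate, a lower-confidence-bound acquisition for relaxing and selecting candidates, and a clustering-based scheme for picking parent structures. The sketch below illustrates these ideas under stated assumptions; the function names, parameter values (`l_long`, `l_short`, `beta`, `kappa`, `cutoff`), and the simple distance-threshold clustering are ours for illustration, not the paper's actual implementation or parametrization:

```python
import numpy as np

def two_scale_kernel(x1, x2, theta0=1.0, l_long=4.0, l_short=0.4, beta=0.01):
    # Weighted sum of two Gaussians over a structural feature vector:
    # a long length scale captures the global shape of the landscape,
    # a short one resolves nearby minima. Weights/scales are illustrative.
    d2 = np.sum((np.asarray(x1) - np.asarray(x2)) ** 2)
    return theta0 * ((1.0 - beta) * np.exp(-d2 / (2.0 * l_long ** 2))
                     + beta * np.exp(-d2 / (2.0 * l_short ** 2)))

def gpr_predict(X_train, y_train, x_star, noise=1e-8):
    # Standard GP posterior mean and standard deviation at x_star,
    # built from the two-length-scale kernel above.
    K = np.array([[two_scale_kernel(a, b) for b in X_train] for a in X_train])
    K += noise * np.eye(len(X_train))
    k_star = np.array([two_scale_kernel(x_star, a) for a in X_train])
    alpha = np.linalg.solve(K, y_train)
    mu = k_star @ alpha
    var = two_scale_kernel(x_star, x_star) - k_star @ np.linalg.solve(K, k_star)
    return mu, np.sqrt(max(var, 0.0))

def lower_confidence_bound(mu, sigma, kappa=2.0):
    # Optimistic energy estimate E_lcb = mu - kappa*sigma: candidates are
    # relaxed in, and selected by, this surface, so uncertain regions of
    # configuration space are favored alongside low predicted energies.
    return mu - kappa * sigma

def select_parents(features, energies, cutoff=0.5):
    # Stand-in for the clustering-based parent sampling: visit structures
    # in order of increasing energy; a structure founds a new cluster if it
    # is farther than `cutoff` from every existing cluster representative.
    # The representatives (lowest-energy member of each cluster) form a
    # diverse population of parents.
    order = np.argsort(energies)
    reps = []
    for i in order:
        if all(np.linalg.norm(features[i] - features[j]) > cutoff for j in reps):
            reps.append(i)
    return reps
```

For example, a surrogate trained on three toy feature vectors reproduces the training energies with near-zero predicted uncertainty there, while the lower confidence bound dips below the posterior mean wherever the uncertainty is finite; `select_parents` then keeps one low-energy representative per cluster of similar structures.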