A new hybrid evolutionary algorithm (EA) for Gaussian mixture model-based clustering is proposed. The EA is a steady-state method that, in each generation, selects two individuals from the population, creates two offspring by either mutation or crossover, and fine-tunes the offspring with the expectation-maximization (EM) algorithm. The offspring then compete with their parents for survival into the next generation. The mutation operator is a random swap, which replaces a component mean with a randomly chosen feature vector. The crossover operator copies a random component from a source mixture into a destination mixture, favoring source components located far from the components of the destination mixture. In computational experiments, the approach was compared with a multiple-restart EM algorithm, a random swap EM method, and a state-of-the-art hybrid evolutionary algorithm for Gaussian mixture model learning on one real and 29 synthetic datasets. The results indicate that, given the same computational budget, the proposed method usually learns mixtures with higher log-likelihoods than the competing algorithms, and the data partitions it produces correspond best to the original divisions of the datasets into classes.
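The two variation operators described above can be illustrated with a minimal sketch. The code below operates only on component means (covariances and mixing weights are omitted for brevity), and the exact replacement rule in crossover is an assumption for illustration: the copied source component overwrites the nearest destination component, and source components are sampled with probability proportional to their distance from the destination mixture.

```python
import numpy as np

def random_swap_mutation(means, data, rng):
    """Random swap mutation: replace one randomly chosen component mean
    with a randomly chosen feature vector from the dataset."""
    child = means.copy()
    k = rng.integers(len(child))            # component to replace
    child[k] = data[rng.integers(len(data))]  # random feature vector
    return child

def distant_component_crossover(dst_means, src_means, rng):
    """Crossover: copy one component of the source mixture into the
    destination mixture, favoring source components that lie far from
    every destination component. (Illustrative sketch: the sampling and
    replacement rules here are assumptions, not the paper's exact ones.)"""
    # distance from each source component to its nearest destination component
    d = np.linalg.norm(src_means[:, None, :] - dst_means[None, :, :], axis=2)
    nearest = d.min(axis=1)
    # pick a source component with probability proportional to that distance
    s = rng.choice(len(src_means), p=nearest / nearest.sum())
    child = dst_means.copy()
    child[np.argmin(d[s])] = src_means[s]   # overwrite closest destination component
    return child

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 2))                      # toy 2-D dataset
parent_a = data[rng.choice(100, size=3, replace=False)]  # 3-component mixture
parent_b = data[rng.choice(100, size=3, replace=False)]
mutant = random_swap_mutation(parent_a, data, rng)
offspring = distant_component_crossover(parent_a, parent_b, rng)
print(mutant.shape, offspring.shape)
```

In the full algorithm, an offspring produced this way would be fine-tuned with a few EM iterations before competing with its parents for survival.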