Multiple kernel clustering (MKC) aims to learn an optimal kernel from several precomputed basic kernels so as to better serve clustering. Most MKC algorithms adhere to a common assumption that the optimal kernel is a linear combination of the basic kernels. Based on a min-max framework, a recently proposed MKC method termed simple multiple kernel k-means (SimpleMKKM) can acquire a high-quality unified kernel. Although SimpleMKKM has achieved promising clustering performance, we observe that it cannot benefit from any prior knowledge. As a result, the learned partition matrix may deviate considerably from the expected one, especially in clustering tasks where the ground truth is absent during learning. To tackle this issue, we propose a novel algorithm termed regularized simple multiple kernel k-means with kernel average alignment (R-SMKKM-KAA). Experimental results of existing MKC algorithms indicate that the average partition is a strong baseline that reflects the true clustering structure. To exploit this knowledge, we add the alignment with the average partition as a regularization term that prevents the learned unified partition from drifting far from it. We further design an efficient optimization algorithm for the resulting problem. In this way, both the incorporated prior knowledge and the combination of basic kernels help to learn a better unified partition, and the clustering performance is thereby significantly improved. Extensive experiments on nine common datasets demonstrate the effectiveness of incorporating prior knowledge into SimpleMKKM.
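As a minimal sketch of one plausible form of the regularized objective (the notation here is assumed for illustration and is not taken from the paper): writing $K_p$ for the $p$-th of $m$ basic kernels, $\boldsymbol{\gamma}$ for the kernel weights on the simplex $\Delta$, $H \in \mathbb{R}^{n \times k}$ for the unified partition matrix, $\bar{F}$ for the average partition, and $\lambda > 0$ for a trade-off parameter, the SimpleMKKM min-max problem augmented with an average-alignment regularizer could be written as
\[
\min_{\boldsymbol{\gamma} \in \Delta} \; \max_{H^{\top} H = I_k} \; \operatorname{Tr}\!\Big(\sum_{p=1}^{m} \gamma_p^{2} K_p \, H H^{\top}\Big) \;+\; \lambda \, \operatorname{Tr}\!\big(\bar{F} \bar{F}^{\top} H H^{\top}\big),
\]
where the first term is the standard SimpleMKKM objective and the second term rewards alignment between the learned partition $H$ and the average partition $\bar{F}$; the exact regularizer used by R-SMKKM-KAA may differ in detail from this sketch.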