The effectiveness of generative adversarial networks (GANs) has been demonstrated in many works. However, the convergence of GANs is only guaranteed mathematically under proper hyperparameters and certain network structures. Given the difficulty of constructing a large and clean dataset, GANs must in many cases extract enough information from a relatively small dataset. This can lead to poor coverage performance, and one way to tackle this problem is pretraining. Since this method relies on importing external information from the pretraining dataset, this paper discusses the effect the pretraining dataset has on the model and proposes a simple criterion for choosing a pretraining dataset. Specifically, this paper focuses on the task of generating animation-style characters, and the results show that providing a pretraining dataset with higher diversity leads to better GAN performance. These results suggest that a carefully chosen pretraining dataset can improve the output quality of a GAN.