Mode collapse has been a persistent challenge in generative adversarial networks (GANs), and it directly limits the application of GANs in many domains. Existing attempts to solve this problem have serious limitations: models using optimal transport (OT) strategies (e.g., the Wasserstein distance) can suffer from vanishing or exploding gradients; increasing the number of generators can cause several generators to focus on the same mode; and approaches that modify the loss function also fail to resolve mode collapse satisfactorily. In this article, we reduce mode collapse by formulating it as a Monge OT-map problem. We show that the Monge problem can be transformed into the distribution-transformation problem in GANs, and that a rectified affine neural network can serve as the required measurable map. Building on this, we propose Monge GAN, which uses this measurable map to transform the generated data distribution into the original data distribution. We use the Kantorovich formulation to obtain the OT cost, which serves as the OT distance between the two distributions. Finally, we conduct extensive experiments on both image and numerical datasets to validate that Monge GAN reduces mode collapse.
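For intuition about the Kantorovich formulation invoked above, the sketch below solves the discrete Kantorovich linear program between two small empirical distributions and returns the resulting OT cost. This is a generic illustration under an assumed absolute-difference ground cost, not the paper's implementation; the function name `kantorovich_cost` and the toy data are our own.

```python
import numpy as np
from scipy.optimize import linprog

def kantorovich_cost(x, y, a, b):
    """Discrete Kantorovich LP:
       min_P  sum_ij C_ij * P_ij
       s.t.   row sums of P equal a, column sums equal b, P >= 0,
    with ground cost C_ij = |x_i - y_j| (1-D support points)."""
    n, m = len(x), len(y)
    C = np.abs(x[:, None] - y[None, :])     # cost matrix, shape (n, m)
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0    # row-sum (marginal a) constraints
    for j in range(m):
        A_eq[n + j, j::m] = 1.0             # column-sum (marginal b) constraints
    b_eq = np.concatenate([a, b])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.fun                          # optimal transport cost

# Two toy discrete distributions on the real line.
x = np.array([0.0, 1.0]); a = np.array([0.5, 0.5])
y = np.array([0.0, 2.0]); b = np.array([0.5, 0.5])
print(kantorovich_cost(x, y, a, b))  # 0.5: half the mass moves from 1.0 to 2.0
```

With an `|x - y|` cost, the optimal value coincides with the 1-Wasserstein distance between the two empirical measures, which is the kind of OT distance the abstract refers to.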