Abstract

Generative adversarial networks (GANs) are popular tools for learning the distribution of real samples and generating new ones, where the quality of the generated images and the degree of preserved variation are two major concerns. In view of this, we propose α-EGAN, which uses the energy distance as the loss function of the generator and is proven to be effective in mitigating mode collapse. Moreover, an early stopping rule is proposed within a hypothesis-testing framework to avoid unhealthy competition between the generator and the discriminator, thus achieving a trade-off between image quality and variation. As a byproduct, the energy distance under the Euclidean norm can serve as a novel metric for evaluating samples generated by a GAN. Experiments are conducted on simulated manifold datasets, as well as the real MNIST and CelebA face datasets, showing that the proposed α-EGAN outperforms several competitors in both training stability and image quality.
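The energy distance the abstract refers to has a standard empirical form: for samples X and Y, D²(X, Y) = 2·E‖X − Y‖ − E‖X − X′‖ − E‖Y − Y′‖ under the Euclidean norm. The sketch below is an illustrative NumPy implementation of this empirical estimator, not the paper's actual code; the function name and interface are assumptions.

```python
import numpy as np

def energy_distance(x, y):
    """Empirical (squared) energy distance between sample sets
    x of shape (n, d) and y of shape (m, d), under the Euclidean norm:
        D^2 = 2 E||X - Y|| - E||X - X'|| - E||Y - Y'||.
    Nonnegative, and zero iff the two empirical distributions coincide.
    """
    def mean_pairwise(a, b):
        # Mean Euclidean distance over all pairs (a_i, b_j).
        diff = a[:, None, :] - b[None, :, :]
        return np.linalg.norm(diff, axis=-1).mean()

    return 2.0 * mean_pairwise(x, y) - mean_pairwise(x, x) - mean_pairwise(y, y)
```

Used as an evaluation metric, one would compare a batch of generated samples against a batch of real samples: a smaller energy distance indicates the generated distribution is closer to the real one.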
