Abstract

The interpretability of Generative Adversarial Networks (GANs) is closely related to their optimization objective functions; that is, information metrics play an important role in network training and data generation. In the original GAN, the objective function based on the Kullback-Leibler (KL) divergence limits the performance of both training and data generation. It is therefore worthwhile to investigate alternative objective functions for GAN optimization that improve the efficiency of network learning from the perspective of information metrics. In this paper, an exponential-form objective function derived from the Message Importance Measure (MIM) is adopted to replace the logarithmic-form objective in the optimization of adversarial networks. The resulting approach, named MIM-based GAN, may expose more hidden information, improving interpretability of the training process and of the generation of probability events. Specifically, we first analyze the intrinsic relationship between the proposed approach and other classical GANs. We then discuss its theoretical advantages in training performance, including sensitivity and convergence rate, compared with the original GAN, LSGAN, and WGAN. Finally, we run simulations on real datasets to show why the MIM-based GAN achieves state-of-the-art performance in both training and data generation.
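
To make the idea concrete, the following is a minimal PyTorch-style sketch of what such an exponential-form objective could look like. It is an illustration only, not the paper's definitive formulation: the function names are hypothetical, and the specific substitution shown here, replacing log D(x) and log(1 - D(G(z))) in the original GAN value function with e^{D(x)} and e^{1 - D(G(z))}, is an assumption based on the abstract's description; the precise MIM-based objective is defined in the paper body.

    # Hedged sketch of a MIM-style (exponential-form) GAN objective.
    # Assumption: the logarithms in the original GAN value function
    #     E_x[log D(x)] + E_z[log(1 - D(G(z)))]
    # are replaced pointwise by exponentials:
    #     E_x[e^{D(x)}]  + E_z[e^{1 - D(G(z))}].
    # Function names are hypothetical, for illustration only.
    import torch

    def mim_discriminator_loss(d_real: torch.Tensor, d_fake: torch.Tensor) -> torch.Tensor:
        # d_real, d_fake: discriminator outputs in (0, 1), e.g. after a sigmoid.
        # The discriminator maximizes E[e^{D(x)}] + E[e^{1 - D(G(z))}],
        # so we return the negation for a gradient-descent optimizer.
        return -(torch.exp(d_real).mean() + torch.exp(1.0 - d_fake).mean())

    def mim_generator_loss(d_fake: torch.Tensor) -> torch.Tensor:
        # The generator minimizes E[e^{1 - D(G(z))}], pushing D(G(z)) toward 1.
        return torch.exp(1.0 - d_fake).mean()

Because the exponential grows faster than the logarithm, such a substitution would weight confidently classified samples more heavily, which is one way to read the abstract's claim about sensitivity to small-probability events.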
