Abstract

Generative adversarial networks (GANs) have achieved remarkable success in generating high-quality synthetic data by learning the underlying distribution of the target data. Recent efforts have utilized optimal transport (OT) to tackle the gradient-vanishing and instability issues in GAN training, using the Wasserstein distance to measure the discrepancy between the generator distribution and the real data distribution. However, most OT-based GANs define their loss functions in Euclidean space, which limits their ability to capture the high-order statistics that matter in a variety of practical applications. In this article, we propose a computational framework that alleviates this issue from both theoretical and practical perspectives. Specifically, we generalize OT-based GANs from Euclidean space to a reproducing kernel Hilbert space (RKHS) and propose the Hilbert Optimal Transport GAN (HOT-GAN). First, we design HOT-GAN with a Hilbert embedding that allows the discriminator to exploit more informative, high-order statistics in the RKHS. Second, we prove that HOT-GAN admits a closed-form kernel reformulation in the RKHS, yielding a tractable objective under the GAN framework. Third, the HOT-GAN objective is provably differentiable with respect to the generator parameters, which facilitates learning powerful generators via adversarial kernel learning. Extensive experiments show that HOT-GAN consistently outperforms representative GAN baselines.
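To make the general idea concrete, the following is a minimal sketch of an OT loss whose ground cost lives in an RKHS. It relies on the standard kernel trick, ||phi(x) - phi(y)||_H^2 = k(x,x) - 2k(x,y) + k(y,y), to build a kernel-induced cost matrix, and on unrolled Sinkhorn iterations to obtain an entropy-regularized OT value that is differentiable with respect to the generator's samples. The Gaussian kernel, the function names, and the sigma/eps settings are illustrative assumptions for this sketch; it is not the closed-form objective proved in the article.

    import torch

    def rkhs_cost_matrix(x, y, sigma=1.0):
        # Kernel trick: ||phi(x) - phi(y)||_H^2 = k(x,x) - 2 k(x,y) + k(y,y).
        # For the Gaussian kernel, k(x,x) = k(y,y) = 1, so the cost is 2 - 2 k(x,y).
        sq_dists = torch.cdist(x, y) ** 2
        k = torch.exp(-sq_dists / (2.0 * sigma ** 2))
        return 2.0 - 2.0 * k

    def sinkhorn_ot(cost, eps=0.1, n_iters=100):
        # Entropy-regularized OT between uniform marginals; the unrolled
        # iterations keep the loss differentiable w.r.t. the cost matrix.
        n, m = cost.shape
        mu = torch.full((n,), 1.0 / n, device=cost.device)
        nu = torch.full((m,), 1.0 / m, device=cost.device)
        K = torch.exp(-cost / eps)
        u = torch.ones_like(mu)
        for _ in range(n_iters):
            v = nu / (K.t() @ u)
            u = mu / (K @ v)
        plan = u[:, None] * K * v[None, :]   # approximate transport plan
        return (plan * cost).sum()           # <plan, cost>: the OT loss

    # Illustrative usage: real and fake are (batch, dim) tensors,
    # with fake = generator(noise). Backpropagating through the loss
    # trains the generator against the kernel-induced OT discrepancy.
    # loss = sinkhorn_ot(rkhs_cost_matrix(real, fake))
    # loss.backward()

Because the cost matrix is built entirely from kernel evaluations, the discrepancy is computed in the RKHS without ever materializing the feature map, which is the property the kernel reformulation exploits.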
