Abstract

Domain adaptation is critical for bridging source and target domains whose data distributions differ across domains or tasks. The state-of-the-art adversarial feature learning model, Bidirectional Generative Adversarial Networks (BiGAN), forces generative models to align with an arbitrarily complex distribution in a latent space. However, BiGAN matches only a single data distribution without exploiting multi-domain structure, so the learned latent representation cannot transfer to related target domains. Recent research has shown that GANs combined with cycle-consistency constraints are effective for image translation. We therefore propose a novel framework, Transferable Bidirectional Generative Adversarial Networks with Cycle-Consistent Constraints (Cycle-TBiGAN), for cross-domain translation, which aims to learn an aligned latent feature representation and a mapping function between domains. Our framework is suitable for a wide variety of domain adaptation scenarios. We show strong results on the task of image translation without prior ground-truth knowledge. Extensive experiments on several public datasets and quantitative comparisons demonstrate the superiority of our approach over previous methods.
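The cycle-consistency constraint mentioned above can be sketched as a round-trip reconstruction penalty between the two domains: mapping a sample to the other domain and back should recover the original. A minimal NumPy sketch follows; the mapping names `g_st` and `g_ts` are illustrative placeholders, not the paper's API, and the L1 penalty mirrors the common CycleGAN formulation rather than the exact Cycle-TBiGAN objective.

```python
import numpy as np

def cycle_consistency_loss(g_st, g_ts, x_s, x_t):
    """L1 cycle-consistency loss between two domains (hypothetical sketch).

    g_st: callable mapping source-domain samples to the target domain.
    g_ts: callable mapping target-domain samples back to the source domain.
    x_s, x_t: arrays of source- and target-domain samples.
    """
    # Penalize deviation after the source -> target -> source round trip.
    loss_s = np.mean(np.abs(g_ts(g_st(x_s)) - x_s))
    # Penalize deviation after the target -> source -> target round trip.
    loss_t = np.mean(np.abs(g_st(g_ts(x_t)) - x_t))
    return loss_s + loss_t

# When the two mappings invert each other, the loss vanishes.
x = np.ones((2, 3))
halve, double = (lambda z: z * 0.5), (lambda z: z * 2.0)
print(cycle_consistency_loss(halve, double, x, x))  # 0.0
```

In practice this term is added to the adversarial losses of the generators, encouraging the learned mapping functions to be mutually consistent rather than collapsing to arbitrary alignments.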
