GradTrans: Transformer-Based Gradient Guidance for Image Generation

Abstract

Image generation has attracted widespread attention in recent years alongside the development of generative models. Existing works mostly prioritize the quality of generated samples. In this work, we introduce a lightweight transformer-based module, called GradTrans, that offers a new balance in the speed-performance trade-off for image generation with generative adversarial networks. GradTrans leverages the instructive information in the discriminator network to guide the generator network toward higher generation quality at inference time, with little additional cost. Extensive experiments are conducted on the unconditional image generation and style transfer tasks across diverse datasets, including CIFAR10, STL10, and Horse2Zebra, demonstrating that GradTrans outperforms related methods by a significant margin and generalizes readily across different base models.
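The abstract does not spell out how discriminator gradients guide the generator at inference, so the following is only a minimal NumPy sketch of the general idea of inference-time gradient guidance. The discriminator here is a toy differentiable "realism" score, and all names (`discriminator_score`, `refine_sample`, `TARGET`) are hypothetical, not from the paper.

```python
import numpy as np

# Toy stand-in for a trained discriminator: a differentiable score
# that peaks at some reference "real data" statistics (assumption,
# purely for illustration).
TARGET = np.full(8, 0.5)

def discriminator_score(x):
    # Higher means the sample looks more "real" to this toy critic.
    return -np.sum((x - TARGET) ** 2)

def discriminator_grad(x):
    # Analytic gradient of the score with respect to the sample.
    return -2.0 * (x - TARGET)

def refine_sample(x, steps=10, lr=0.1):
    # Inference-time gradient guidance: nudge the generator's output
    # along the discriminator's gradient to raise its realism score,
    # without retraining either network.
    for _ in range(steps):
        x = x + lr * discriminator_grad(x)
    return x

raw = np.zeros(8)            # pretend this came from the generator
refined = refine_sample(raw)
```

In this sketch the refined sample scores strictly higher under the critic than the raw one; the paper's contribution, per the abstract, is to make this guidance cheap via a lightweight transformer module rather than per-sample optimization.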
