Abstract
Generative Adversarial Networks (GANs) can effectively synthesise new realistic images and estimate the underlying distribution of samples through adversarial learning. Nevertheless, conventional GANs require a large number of training samples to produce plausible results. Inspired by the human capacity to quickly learn new concepts from a small number of examples, several meta-learning approaches for few-shot datasets have been proposed. However, most meta-learning algorithms are designed to tackle few-shot classification and reinforcement learning tasks. Moreover, existing meta-learning models for image generation are complex, which lengthens the training time required. This study proposes Fast Adaptive Meta-Learning (FAML), based on a GAN and an encoder network, for few-shot image generation. The model can generate new realistic images from previously unseen target classes given only a small number of examples. By training a simpler network with conditional feature vectors from the encoder while increasing the number of generator iterations, FAML converges 10 times faster and requires only one-fourth of the trainable parameters of baseline models. Visualisation results are presented in the paper. The model improves few-shot image generation, achieving the lowest FID score, the highest IS, and comparable LPIPS on the MNIST, Omniglot, VGG-Faces, and miniImageNet datasets. The source code is available at https://github.com/phaphuang/FAML.
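To make the architectural idea concrete, the sketch below illustrates (in PyTorch) a generator conditioned on feature vectors produced by an encoder from a few support images, as the abstract describes. This is a minimal illustrative sketch, not the authors' implementation (which is at the linked repository): the layer sizes, 28x28 image shape, and the Encoder/Generator class names are all assumptions made for the example.

```python
# Minimal sketch of an encoder-conditioned generator for few-shot
# image generation. All shapes and layer widths are illustrative
# assumptions, not the FAML authors' configuration.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a support image to a conditioning feature vector."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1),   # 28x28 -> 14x14
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 14x14 -> 7x7
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):
    """Small generator conditioned on the encoder's feature vector."""
    def __init__(self, z_dim=64, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim + feat_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 28 * 28),
            nn.Tanh(),
        )

    def forward(self, z, feat):
        # Concatenate noise with the conditional feature vector.
        h = torch.cat([z, feat], dim=1)
        return self.net(h).view(-1, 1, 28, 28)

# Usage: generate a batch of images conditioned on one support example.
encoder, generator = Encoder(), Generator()
support = torch.randn(1, 1, 28, 28)      # one few-shot example (MNIST-sized)
feat = encoder(support).repeat(16, 1)    # share the condition across a batch
z = torch.randn(16, 64)
fake = generator(z, feat)                # 16 generated 28x28 images
print(fake.shape)                        # torch.Size([16, 1, 28, 28])
```

Because the generator only has to model variation around the encoder's class-conditional features rather than the full image distribution, it can be kept small, which is consistent with the parameter and convergence savings the abstract reports.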