Abstract

Given recent advances in deep learning, semi-supervised techniques have seen a rise in interest. Generative adversarial networks (GANs) represent one recent approach to semi-supervised learning (SSL). This paper presents a survey of methods that use GANs for SSL. Previous work applying GANs to SSL is classified into pseudo-labeling/classification, encoder-based, TripleGAN-based, two-GAN, manifold regularization, and stacked discriminator approaches. A quantitative and qualitative analysis of the various approaches is presented. The R3-CGAN architecture is identified as the GAN architecture with state-of-the-art results. Given the recent success of non-GAN-based approaches for SSL, future research opportunities involving the adaptation of elements of these approaches into GAN-based implementations are also identified.

Highlights

  • With recent advances in deep learning and its applications, research opportunities in the area have expanded and diversified in different directions

  • Generative adversarial networks (GANs) were effective at taking a latent space and generating data, but there was no technique for GANs to project the data back into the latent space

  • An interesting technique was proposed by SVMGAN [44], which tried to solve the issue of GAN-based semi-supervised learning (SSL) models being sensitive to local perturbations within an attention-based GAN architecture, while manifold regularization based on [42] was added as an additional regularization term to the loss function to make full use of unlabeled samples through a Monte Carlo approximation (a minimal sketch of such a regularizer follows this list)
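
The sketch below illustrates one way a Monte Carlo approximation of a manifold regularization term can be computed over GAN-generated samples. It is a minimal PyTorch-style example; the names (generator, classifier, latent_dim, eps) and the specific form of the penalty are assumptions for illustration, not the actual SVMGAN [44] or [42] implementation.

    # Hedged sketch: Monte Carlo estimate of a manifold regularization term.
    # Latent codes z are perturbed slightly, nearby points G(z) and G(z + delta)
    # on the learned manifold are generated, and the classifier is penalized
    # for changing its output between them.
    import torch

    def manifold_regularization(generator, classifier, batch_size=64,
                                latent_dim=100, eps=1e-2, n_samples=1):
        reg = 0.0
        for _ in range(n_samples):
            z = torch.randn(batch_size, latent_dim)               # sample latent codes from the prior
            delta = torch.randn_like(z)
            delta = eps * delta / delta.norm(dim=1, keepdim=True)  # small random step in latent space
            x = generator(z)                                       # point on the learned manifold
            x_perturbed = generator(z + delta)                     # nearby point on the manifold
            # Squared change in classifier output approximates the classifier's
            # sensitivity along the manifold; averaging over sampled z gives the
            # Monte Carlo estimate of the regularizer.
            diff = classifier(x) - classifier(x_perturbed)
            reg = reg + (diff ** 2).sum(dim=1).mean()
        return reg / n_samples

In a training loop, the returned value would typically be multiplied by a weighting coefficient and added to the supervised classification loss as the additional regularization term.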

Summary

Introduction

With recent advances in deep learning and its applications, research opportunities in the area have expanded and diversified in different directions. Semi-supervised learning relies on the assumption that the data distribution over the input space embeds significant information about the distribution of the labels in the output space [1]. Points can be connected by short curves that do not pass through low-density regions [4]; on the basis of this assumption, the decision boundary should not cross such regions. The second assumption, called the low-density assumption, states that the decision boundary in a classifier should pass through a low-density area in the input space [1], that is, an area in the input space where the probability of a data point existing is low.
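
A one-line formalization may make the low-density assumption concrete; the notation below (marginal density p(x), classifier f, decision boundary B_f, threshold epsilon) is assumed for illustration and is not taken from [1].

    % Low-density assumption (sketch; notation assumed, not from the paper):
    % p(x) is the marginal data density, f a classifier, and B_f its decision
    % boundary. The assumption asks the boundary to avoid high-density areas.
    \[
        B_f \subseteq \{\, x \in \mathcal{X} \;:\; p(x) \le \varepsilon \,\}
        \quad \text{for some small } \varepsilon > 0 .
    \]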

Common Techniques Used in Semi-Supervised Learning
Consistency Regularization
Pseudo-Labeling
Entropy Minimization
Taxonomy
Extensions Using Pseudo-Labeling and Classifiers
Encoder-Based
The TripleGAN Approach
Methods
Results
Khalvati
Quantitative Analysis
Qualitative Analysis
Examples
Future Directions
Conclusions
