Abstract

In recent years, deep learning has revolutionized the field of computer vision and achieved state-of-the-art performance in a variety of applications. However, training a robust deep neural network requires a large amount of hand-labeled training data, which is time-consuming and labor-intensive to acquire. Active learning and transfer learning are two popular methodologies for learning with limited labeled data. Active learning attempts to select the most salient and exemplary instances from large amounts of unlabeled data; transfer learning leverages knowledge from a labeled source domain to develop a model for a (related) target domain where labeled data is scarce. In this paper, we propose a novel active transfer learning algorithm with the objective of learning informative feature representations from a given dataset using a deep convolutional neural network, under the constraint of weak supervision. We formulate a loss function relevant to the research task and use gradient descent to optimize this loss and train the deep network. To the best of our knowledge, this is the first research effort to propose a task-specific loss function integrating active and transfer learning, with the goal of learning informative feature representations using a deep neural network under weak human supervision. Our extensive empirical studies on a variety of challenging, real-world applications demonstrate the merit of our framework over competing baselines.
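The abstract leaves the exact formulation to the body of the paper; the following is only a minimal, hypothetical sketch of how an objective combining active and transfer learning could be wired up, assuming a cross-entropy term on labeled source data, a linear-kernel maximum mean discrepancy (MMD) term for domain alignment, and entropy-based uncertainty sampling for active selection. None of these specific choices (the Encoder architecture, mmd, combined_loss, select_for_labeling, or the lam trade-off weight) come from the paper itself.

```python
# Hypothetical sketch of an active + transfer learning objective -- NOT the
# paper's actual loss. Assumes a shared CNN encoder, a supervised term on
# labeled source data, an MMD alignment term, and entropy-based selection.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy convolutional encoder standing in for the paper's deep CNN."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x):
        z = self.features(x)          # feature representation
        return z, self.classifier(z)  # features and class logits

def mmd(source_feats, target_feats):
    """Linear-kernel MMD between mean source and target features --
    a simple, common choice for a domain-alignment penalty."""
    return (source_feats.mean(0) - target_feats.mean(0)).pow(2).sum()

def combined_loss(model, xs, ys, xt, lam=0.1):
    """Supervised cross-entropy on labeled source data plus a domain
    alignment term on unlabeled target data; lam trades off the two."""
    zs, logits_s = model(xs)
    zt, _ = model(xt)
    return F.cross_entropy(logits_s, ys) + lam * mmd(zs, zt)

def select_for_labeling(model, x_unlabeled, k=8):
    """Entropy-based active selection: return indices of the k most
    uncertain unlabeled instances to send for human annotation."""
    with torch.no_grad():
        _, logits = model(x_unlabeled)
        probs = F.softmax(logits, dim=1)
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    return entropy.topk(k).indices

# One plain gradient-descent (SGD) step on synthetic data.
model = Encoder()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
xs, ys = torch.randn(32, 3, 32, 32), torch.randint(0, 10, (32,))
xt = torch.randn(32, 3, 32, 32)
loss = combined_loss(model, xs, ys, xt)
opt.zero_grad()
loss.backward()
opt.step()
queried = select_for_labeling(model, xt)  # instances to hand-label next
```

In a sketch like this, each round of weak supervision would alternate between optimizing the combined loss and querying labels for the instances returned by select_for_labeling; the paper's actual loss and selection criterion may differ substantially.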
