Abstract

Most existing active learning studies focus on designing sample selection algorithms. However, several fundamental problems deserve investigation to provide deeper insight into active learning. In this article, we conduct an in-depth investigation of active learning for classification from the perspective of model change. We derive a general active learning framework for classification, called maximum model change (MMC), which aims to query the most influential examples. Model change is quantified as the difference between the model parameters before and after training on the training set expanded with a candidate example. Inspired by the stochastic gradient update rule, the gradient of the loss with respect to a candidate example is adopted to approximate the model change. We apply this framework to two popular classifiers: support vector machines and logistic regression. We analyze the convergence property of MMC and provide a theoretical justification for it. We also explore the connection between MMC and uncertainty-based sampling to offer a unified view, discuss its potential applicability to other learning models, and demonstrate its usefulness in a wide range of applications. We validate the MMC strategy on two kinds of benchmark datasets, the UCI repository and ImageNet, and show that it outperforms many state-of-the-art methods.
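To make the gradient-based approximation concrete, the sketch below illustrates MMC-style selection for binary logistic regression. It is a minimal illustration, not the paper's reference implementation: the use of scikit-learn, the expectation of the gradient norm over candidate labels weighted by predicted probabilities, and all function names here are assumptions made for this example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def mmc_select(model, X_pool):
    """Score each unlabeled candidate by its approximate model change.

    Following the SGD-inspired approximation, the change a candidate x
    would induce is proportional to the gradient of the loss at x. Since
    the true label is unknown, this sketch takes the expectation of the
    gradient norm over the two labels, weighted by the model's predicted
    probabilities (an assumption, not necessarily the paper's choice).
    """
    proba = model.predict_proba(X_pool)       # shape (n, 2): P(y=0|x), P(y=1|x)
    scores = model.decision_function(X_pool)  # w.x + b
    sigma = lambda z: 1.0 / (1.0 + np.exp(-z))
    x_norms = np.linalg.norm(X_pool, axis=1)
    # For log-loss l(w; x, y) = log(1 + exp(-y (w.x + b))) with y in {-1, +1},
    # the gradient w.r.t. w is -y * sigma(-y (w.x + b)) * x, so its norm is
    # sigma(-y (w.x + b)) * ||x||.
    grad_norm_pos = sigma(-scores) * x_norms  # gradient norm if y = +1
    grad_norm_neg = sigma(scores) * x_norms   # gradient norm if y = -1
    # Expected model change under the model's own label distribution.
    expected_change = proba[:, 1] * grad_norm_pos + proba[:, 0] * grad_norm_neg
    return int(np.argmax(expected_change))

# Usage: query the pool example with the maximum expected model change.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 5))
y_train = (X_train[:, 0] > 0).astype(int)
X_pool = rng.normal(size=(100, 5))
clf = LogisticRegression().fit(X_train, y_train)
print("query index:", mmc_select(clf, X_pool))
```

The selected example would then be labeled by an oracle, added to the training set, and the model retrained, with the scoring repeated at each round of the active learning loop.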
