Abstract

In hyperspectral image classification, a large number of labeled samples is needed to train a deep network. However, labeling hyperspectral images is tedious, difficult, and time-consuming. In this letter, a new active learning (AL) framework for deep networks is proposed. An auxiliary deep network is constructed alongside the basic learner to estimate the uncertainty of unlabeled samples in the candidate data set: the features of the original training data and the features of the basic learner's middle hidden layer are fed jointly into a fully connected network trained with a newly defined loss function. To avoid both an insufficient number of labeled samples and the heavy computation typical of AL, sampling is performed on the candidate data set and augmentation is applied to the newly selected samples. The proposed model is evaluated on several data sets and compared with related methods, and the results show that it outperforms the other methods on these data sets.
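The query step described above can be pictured with a minimal sketch: an auxiliary network that concatenates a sample's raw spectral features with the basic learner's middle-hidden-layer features and maps them to a scalar uncertainty score used to rank candidates. The PyTorch framing, the layer sizes, and the name AuxiliaryUncertaintyNet below are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class AuxiliaryUncertaintyNet(nn.Module):
    def __init__(self, n_bands: int, hidden_dim: int):
        super().__init__()
        # Fully connected head over the concatenated feature vector
        # (raw spectral features + basic learner's hidden features).
        self.head = nn.Sequential(
            nn.Linear(n_bands + hidden_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # scalar uncertainty estimate per sample
        )

    def forward(self, x_raw: torch.Tensor, h_mid: torch.Tensor) -> torch.Tensor:
        # x_raw: (batch, n_bands) original spectral features of candidate samples
        # h_mid: (batch, hidden_dim) features from the basic learner's middle hidden layer
        z = torch.cat([x_raw, h_mid], dim=1)
        return self.head(z)


# Usage sketch: score a sampled batch from the candidate pool and pick the
# most uncertain samples for labeling (the AL query step).
if __name__ == "__main__":
    aux = AuxiliaryUncertaintyNet(n_bands=200, hidden_dim=64)
    x_raw = torch.randn(32, 200)   # sampled candidate batch (hypothetical sizes)
    h_mid = torch.randn(32, 64)    # corresponding hidden features from the basic learner
    scores = aux(x_raw, h_mid).squeeze(1)
    query_idx = scores.topk(k=5).indices  # indices of samples to label next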
