Abstract

Deep learning classifiers achieve remarkable performance in hyperspectral image classification given sufficient labeled samples, but they struggle when learning with limited labeled samples. Active learning endows deep learning classifiers with the ability to alleviate this deficiency. However, existing active deep learning methods tend to underestimate the feature variability of hyperspectral images when querying informative unlabeled samples subject to certain acquisition heuristics. A major reason for this bias is that the acquisition heuristics are normally derived from the output of a deep learning classifier, whose representational power is bounded by the number of labeled training samples at hand. To address this limitation, we develop a feature-oriented adversarial active learning (FAAL) strategy, which exploits the high-level features from one intermediate layer of a deep learning classifier to establish an acquisition heuristic based on a generative adversarial network (GAN). Specifically, we develop a feature generator for generating fake high-level features and a feature discriminator for discriminating between real high-level features and fake ones. Trained with both real and fake high-level features, the feature discriminator comprehensively captures the feature variability of hyperspectral images and yields a powerful, generalized discriminative capability. We leverage the well-trained feature discriminator as the acquisition heuristic to measure the informativeness of unlabeled samples. Experimental results validate the effectiveness of both (i) the full FAAL framework and (ii) the adversarially learned acquisition heuristic for classifying hyperspectral images with limited labeled samples.
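As a rough illustration of the adversarial component described above, the sketch below (PyTorch-style, not the authors' implementation) trains a feature generator against a feature discriminator on high-level feature vectors. The dimensions (feat_dim, noise_dim, hidden), the MLP architectures, and the standard GAN loss are assumptions made for illustration only.

```python
# Minimal sketch (assumed names and sizes): a GAN over high-level features.
import torch
import torch.nn as nn

feat_dim, noise_dim, hidden = 128, 64, 256  # illustrative dimensions

class FeatureGenerator(nn.Module):
    """Maps random noise to a fake high-level feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim),
        )
    def forward(self, z):
        return self.net(z)

class FeatureDiscriminator(nn.Module):
    """Scores a high-level feature vector; higher means 'looks real'."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )
    def forward(self, f):
        return self.net(f)

G, D = FeatureGenerator(), FeatureDiscriminator()
bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def gan_step(real_feats):
    """One adversarial update on a batch of real high-level features
    taken from an intermediate layer of the classifier."""
    batch = real_feats.size(0)
    z = torch.randn(batch, noise_dim)
    fake_feats = G(z)

    # Discriminator: push real features toward 1, fake features toward 0.
    opt_d.zero_grad()
    loss_d = bce(D(real_feats), torch.ones(batch, 1)) + \
             bce(D(fake_feats.detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: try to fool the discriminator.
    opt_g.zero_grad()
    loss_g = bce(D(G(z)), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()
```

Training the discriminator on both real and generated features is what, in the paper's framing, gives it the generalized discriminative capability later reused as the acquisition heuristic.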

Highlights

  • We propose a feature-oriented adversarial active learning (FAAL) strategy whose acquisition heuristic is learned adversarially from the high-level features of a deep learning classifier

  • Our FAAL framework achieves state-of-the-art performance on two public hyperspectral image datasets for classifying hyperspectral images with limited labeled samples

  • Experiments validate the effectiveness of both the full FAAL framework and the adversarially learned acquisition heuristic

Summary

Introduction

Hyperspectral imaging is characterized by simultaneously capturing the radiance of the earth’s surface at several hundreds of contiguous wavelength bands [1]. Existing active deep learning algorithms query unlabeled samples by evaluating uncertainty based on the output of classifiers, whose representational power is bounded by the number of labeled samples at hand. In this scenario, these methods tend to underestimate the feature variability of hyperspectral images, which spans from the spectral domain to the spatial domain. We develop an active deep learning framework, referred to as feature-oriented adversarial active learning (FAAL), for classifying hyperspectral images with limited labeled samples. The active learning within our FAAL framework is characterized by an acquisition heuristic established via high-level feature-oriented adversarial learning. This enables our FAAL framework to comprehensively capture the feature variability of hyperspectral images and yield an effective hyperspectral image classification scheme; a minimal sketch of the query step appears below.
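The sketch below shows, under assumptions, how a trained feature discriminator could serve as the acquisition heuristic over an unlabeled pool. `classifier.extract_features` is a hypothetical hook for the intermediate high-level features, and treating low discriminator scores as high informativeness is an assumption of this sketch rather than a detail stated in the text above.

```python
# Minimal sketch (not the paper's exact procedure): score the unlabeled pool
# with the feature discriminator D and pick the k most informative samples.
import torch

@torch.no_grad()
def query_unlabeled(classifier, D, unlabeled_x, k):
    feats = classifier.extract_features(unlabeled_x)  # (N, feat_dim) high-level features (hypothetical hook)
    scores = D(feats).squeeze(1)                      # discriminator "realness" scores, shape (N,)
    # Assumption: samples the discriminator finds least "real" are most informative.
    return torch.topk(-scores, k).indices             # indices of the k samples to query
```

The queried samples would then be labeled by an oracle and added to the training set before the classifier is updated, closing the active learning loop.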

Active Learning
Generative Adversarial Networks
Feature-Oriented Adversarial Active Learning
High-level Features from Classifier Division
Adversarial Learning with High-Level Features
Active Query of Unlabeled Samples
Workflow of Full Framework
Experimental Results and Discussion
Datasets
Implementation Details
Analysis of the Naive Classifier
Comparison with Other Active Learning Classifiers
Study on Acquisition Heuristics
Conclusions