Abstract

In remote sensing images, domain adaptation (DA) deals with regions for which labeling information is unknown. Typically, hand-crafted features for learning a common distribution between known and unknown regions have been extensively exploited to perform the classification task in hyperspectral images with the aid of state-of-the-art machine learning algorithms. With limited training samples and hand-crafted features, however, the classification performance degrades significantly. To avoid the engineered feature extraction process, an automatic feature extraction scheme can be useful for generating more complex yet useful features for classification, and deep-learning-based architectures have proved pivotal in this regard. Deep learning algorithms have been used effectively in the hyperspectral domain to solve the DA problem. However, attention-based activation mappings, which are very successful at distinguishing different classes of images by transferring relevant mappings from a deep to a shallow network, have not been widely explored in the DA domain. In this article, we use attention-based DA by transferring different levels of attention, by means of different types of activation mappings, from a deep residual teacher network to a shallow residual student network. Our goal is to provide useful but more complex features to the shallow student network to improve the overall classification in the DA task. We show that, for different kinds of activation mappings, the proposed attention-based transfer improves the performance of the shallow network on the DA problem. It also outperforms state-of-the-art DA methods based on traditional machine learning and deep learning paradigms.
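As a rough illustration of the activation-mapping transfer described above, the sketch below computes spatial attention maps from intermediate feature maps and penalizes the mismatch between the student's and the teacher's maps. It is a minimal sketch assuming a PyTorch-style setup; the function names, layer pairing, and loss weighting are illustrative assumptions rather than the exact formulation used in the article.

```python
# Minimal sketch of activation-based attention transfer from a deep teacher to a
# shallow student (assumed PyTorch; networks and layer pairing are hypothetical).
import torch
import torch.nn.functional as F


def attention_map(activation: torch.Tensor) -> torch.Tensor:
    """Collapse a (B, C, H, W) activation into a normalized (B, H*W) spatial
    attention map by averaging squared channel responses."""
    a = activation.pow(2).mean(dim=1)      # (B, H, W): per-pixel channel energy
    a = a.flatten(start_dim=1)             # (B, H*W)
    return F.normalize(a, p=2, dim=1)      # unit L2 norm per sample


def attention_transfer_loss(student_acts, teacher_acts):
    """Sum of squared L2 distances between student and teacher attention maps
    at paired intermediate layers."""
    loss = 0.0
    for s, t in zip(student_acts, teacher_acts):
        if s.shape[-2:] != t.shape[-2:]:   # align spatial size if depths differ
            s = F.interpolate(s, size=t.shape[-2:], mode="bilinear", align_corners=False)
        loss = loss + (attention_map(s) - attention_map(t)).pow(2).sum(dim=1).mean()
    return loss


# Usage sketch: combine with the usual cross-entropy on labeled pixels,
# with beta an assumed weighting hyperparameter.
# total_loss = F.cross_entropy(student_logits, labels) + beta * attention_transfer_loss(s_acts, t_acts)
```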

Highlights

  • Domain adaptation (DA) in hyperspectral images (HSI) helps to deduce labels for data where labeling information is unavailable

  • If a small number of labels in the target region are available, DA can be seen as a supervised problem where the distribution of the unlabeled pixels in the target region can be learned from the source region

  • Motivated by the recent success of attention-based transfer learning for object classification [12], we proposed a deep-learning-based attention transfer architecture in [13] for DA to classify hyperspectral imagery


Introduction

Domain adaptation (DA) in hyperspectral images (HSI) helps to deduce labels for data where labeling information is unavailable. Although classification problems in DA can differ, the main idea remains the same: the pixel distributions of the source region need to be matched to those of the target region so that labels in the target region can be estimated correctly. Through this kind of matching, knowledge from one region is transferred to another to find the desired hidden features in a given image. In [15], a few informative labeled examples are selected by actively determining the most informative pixels to label from a set of candidates in the target domain, with the help of instance weights and an SVM classifier. Another instance-based DA method iteratively reweights the source data to learn a common space between the heterogeneous source and target domains [16]. The kernel-based instance DA method in [17] adapts the weights of multiple kernels to train from the available labeled samples in the source domain and adds a minimum number of the most informative, actively selected samples to label the pixels in the target domain.
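As a rough illustration of the instance-weighting idea behind methods such as [15] and [16], the sketch below reweights source pixels toward the target distribution with a simple domain-classifier heuristic before training an SVM. The weighting heuristic, function names, and hyperparameters are illustrative assumptions, not the exact procedures of the cited works.

```python
# Minimal sketch of instance-reweighted SVM training for DA (assumed scikit-learn setup).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC


def source_instance_weights(X_source, X_target):
    """Estimate per-source-sample weights with a simple density-ratio heuristic:
    train a source-vs-target domain classifier and weight each source sample by
    p(target | x) / p(source | x)."""
    X = np.vstack([X_source, X_target])
    d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    domain_clf = LogisticRegression(max_iter=1000).fit(X, d)
    p_target = domain_clf.predict_proba(X_source)[:, 1]
    return p_target / np.clip(1.0 - p_target, 1e-6, None)


def fit_weighted_svm(X_source, y_source, X_target_unlabeled):
    """Fit an RBF-kernel SVM on source pixels, reweighted toward the target distribution."""
    w = source_instance_weights(X_source, X_target_unlabeled)
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(X_source, y_source, sample_weight=w)  # SVC accepts per-sample weights
    return clf
```

The same classifier could then be refined with a handful of actively selected labeled target pixels, in the spirit of [15] and [17], by appending them to the training set with large sample weights.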

