Abstract

Convolutional neural networks (CNNs) have been shown to be a powerful tool for image classification. Recently, they have been adopted by the remote sensing community, with applications in material classification from hyperspectral images. However, CNNs are time-consuming to train and often require large amounts of labeled training data. The widespread use of CNNs in the image processing and computer vision communities has been facilitated by networks that have already been trained on large amounts of data. These pretrained networks can be used to initialize networks for new tasks. This transfer of knowledge makes it far less time-consuming to train a new classifier and reduces the need for a large labeled data set. This concept of transfer learning has not yet been fully explored by those using CNNs to train material classifiers from hyperspectral data. This paper provides insight into training hyperspectral CNN classifiers by transferring knowledge from well-labeled data sets to data sets that are less well labeled. It is shown that these CNNs can transfer between completely different domains and sensing platforms and still improve classification performance. The application of this work is in the training of material classifiers on data acquired from field-based platforms, by transferring knowledge from publicly accessible airborne data sets. Factors such as training set size, CNN architecture, and the impact of filter width and wavelength interval are studied.
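The transfer-learning idea summarised above, initialising a new network from pretrained weights and replacing only the task-specific classifier head, can be sketched as follows. This is a minimal illustration, not the paper's method: the layer names, sizes, and class counts are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out, rng):
    """Randomly initialise a dense layer's weight matrix."""
    return rng.normal(0.0, 0.1, size=(n_in, n_out))

# Hypothetical "pretrained" network: weights assumed to have been
# learned on a large, well-labeled airborne hyperspectral data set.
pretrained = {
    "features": init_layer(64, 32, rng),    # spectral feature extractor
    "classifier": init_layer(32, 10, rng),  # head for 10 airborne classes
}

def transfer(pretrained, n_new_classes, rng):
    """Initialise a new network from pretrained weights: keep the
    feature-extraction layers, replace the classifier head so it
    matches the new task's number of material classes."""
    return {
        "features": pretrained["features"].copy(),   # transferred knowledge
        "classifier": init_layer(pretrained["classifier"].shape[0],
                                 n_new_classes, rng),  # trained from scratch
    }

# New field-based task with only 5 material classes and little labeled data.
field_net = transfer(pretrained, n_new_classes=5, rng=rng)
print(field_net["features"].shape)    # (64, 32)  reused layer
print(field_net["classifier"].shape)  # (32, 5)   new head
```

After this initialisation, the whole network (or only the new head) would be fine-tuned on the smaller field data set, which is what reduces both training time and the amount of labeled data needed.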
