Abstract

In this article, we develop a novel fully unsupervised autoencoder-based scheme for nonlinear hyperspectral pixel unmixing. High noise and unresponsive pixels are accounted for by a weighted-averaging approach based on spatially aware filters built from radial basis function (RBF) kernels. The unknown number of endmembers contributing to the data is estimated by computing rank-equivalent kernel covariance matrices. Spatial information is exploited through the RBF-based weighted averaging, which is followed by endmember estimation via K-means clustering. RBF distances from the cluster centers measure the position of each mixed pixel relative to the centers and serve as a preliminary estimate of the abundances. The proposed framework is robust in the presence of unresponsive pixels and remains highly versatile across different nonlinear unmixing models. Extensive numerical tests establish the superiority of the proposed approach with respect to state-of-the-art methods.
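To make the endmember/abundance initialization described above more concrete, the following is a minimal sketch, not the authors' implementation: it clusters the spectra with K-means, takes the cluster centers as candidate endmembers, and converts RBF distances to those centers into a preliminary abundance estimate. The function name, the kernel width `gamma`, and the simplex normalization are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): K-means endmember
# candidates followed by an RBF-distance-based abundance initialization.
import numpy as np
from sklearn.cluster import KMeans

def init_endmembers_and_abundances(X, n_endmembers, gamma=1.0):
    """X: (n_pixels, n_bands) spectra; returns (endmembers, abundances)."""
    km = KMeans(n_clusters=n_endmembers, n_init=10, random_state=0).fit(X)
    endmembers = km.cluster_centers_                       # (R, n_bands)

    # RBF similarity of every pixel to every cluster center.
    sq_dist = ((X[:, None, :] - endmembers[None, :, :]) ** 2).sum(axis=2)
    rbf = np.exp(-gamma * sq_dist)                         # (n_pixels, R)

    # Normalize to the unit simplex as a preliminary abundance estimate
    # (an assumption; the paper may use a different normalization).
    abundances = rbf / rbf.sum(axis=1, keepdims=True)
    return endmembers, abundances
```

In the proposed framework these preliminary abundances would subsequently be refined by the autoencoder; the details of that refinement are specific to the paper.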

Highlights

  • Hyperspectral images differ from traditional color images in that they incorporate short-wave infrared (SWIR), long-wave infrared (LWIR), and ultraviolet (UV) wavelengths, and they split the spectrum into hundreds of bands, whereas color images contain only three bands [1]

  • A well-known method for unmixing under the linear mixing model (LMM) is Vertex Component Analysis (VCA) [3], where the pixel vectors are represented by points residing within a simplex in the feature space, and the endmembers are the vertices of that simplex, whose linear combinations make up the aforementioned points (a short illustration of this model appears after this list)

  • The averaging performed here, together with the averaging carried out by K-means as part of the endmember estimation algorithm and the weighted-averaging filters implemented in the neural network, is extremely effective in mitigating noise and dead pixels (sketched below)
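To illustrate the spatially aware averaging mentioned in the last highlight, here is a minimal sketch under assumptions that may differ from the paper's design: each pixel is replaced by an RBF-weighted average of its spatial neighbours, with dead (all-zero) pixels given zero weight so they neither contribute to nor pull down the average. The window size and kernel width are illustrative.

```python
# Minimal sketch (illustrative, not the authors' filter): RBF-weighted
# spatial averaging over a square window, ignoring dead (all-zero) pixels.
import numpy as np

def rbf_spatial_filter(cube, window=3, gamma=0.5):
    """cube: (rows, cols, bands) hyperspectral image; returns the filtered cube."""
    rows, cols, bands = cube.shape
    r = window // 2
    out = np.zeros_like(cube, dtype=float)

    for i in range(rows):
        for j in range(cols):
            acc = np.zeros(bands)
            wsum = 0.0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        neighbour = cube[ii, jj]
                        if not np.any(neighbour):          # skip dead pixels
                            continue
                        # Spatial RBF weight: closer neighbours count more.
                        w = np.exp(-gamma * (di * di + dj * dj))
                        acc += w * neighbour
                        wsum += w
            out[i, j] = acc / wsum if wsum > 0 else cube[i, j]
    return out
```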
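For the LMM background mentioned in the second highlight above, the following hypothetical snippet generates synthetic pixels as convex combinations of endmember spectra, i.e. as points inside the simplex whose vertices are the endmembers; this is the geometric picture that VCA exploits. The random spectra and Dirichlet abundances are made up purely for illustration.

```python
# Illustrative LMM data generation: pixels are convex combinations of
# endmember spectra, so they lie inside the simplex spanned by the endmembers.
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_endmembers, n_pixels = 200, 3, 1000

E = rng.uniform(0.0, 1.0, size=(n_endmembers, n_bands))   # endmember spectra (rows)
A = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)   # abundances on the unit simplex
X = A @ E                                                 # (n_pixels, n_bands) mixed pixels
X += 0.01 * rng.standard_normal(X.shape)                  # optional additive noise
```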

Summary

INTRODUCTION

Hyperspectral images differ from traditional color images in that they incorporate short-wave infrared (SWIR), long-wave infrared (LWIR), and ultraviolet (UV) wavelengths (depending on the application), and they split the spectrum into hundreds of bands, whereas color images contain only three bands [1]. One significant limitation of the neural-network-based unmixing schemes discussed is their inability to account for hyperspectral image data corrupted by faulty or malfunctioning cameras in which parts of the sensor are unresponsive. Such sensors can contain pixels that gather no information at all, known as dead pixels. Zero values in the pixel vectors shift the corresponding points in the feature space toward the origin along the affected axes, which introduces further inaccuracies. Another property shared by the methods described above is that they require prior knowledge of the number of endmembers present in the data. Numerical tests are presented in Section IV, showing the effectiveness of the proposed framework even under challenging scenarios such as extreme noise and the presence of dead pixels
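One possible reading of the "rank-equivalent kernel covariance" idea from the abstract, which removes the need to know the endmember count a priori, is sketched below: build an RBF Gram matrix over (a subset of) the pixel spectra and count how many eigenvalues remain significant relative to the largest one. The relative threshold, the subsampling, and the use of the Gram matrix as a surrogate for the kernel covariance are assumptions, not the paper's exact criterion.

```python
# Minimal sketch (assumed criterion): estimate the number of endmembers from
# the effective rank of an RBF kernel (Gram) matrix over the pixel spectra.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def estimate_num_endmembers(X, gamma=1.0, rel_tol=1e-2, max_pixels=2000):
    """X: (n_pixels, n_bands); returns an estimate of the endmember count."""
    if X.shape[0] > max_pixels:                 # subsample for tractability
        idx = np.random.default_rng(0).choice(X.shape[0], max_pixels, replace=False)
        X = X[idx]
    K = rbf_kernel(X, gamma=gamma)              # (n, n) kernel Gram matrix
    eigvals = np.linalg.eigvalsh(K)[::-1]       # eigenvalues, descending
    # Count eigenvalues that are significant relative to the largest one.
    return int(np.sum(eigvals > rel_tol * eigvals[0]))
```

Any estimate of this kind is sensitive to the kernel width and the threshold, which in practice would have to be tuned or derived as in the paper.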

PROBLEM STATEMENT AND PRELIMINARIES
Kernel Data Mappings
Spatial Averaging Filter
Unsupervised Endmember Number Estimation
Autoencoders for Nonlinear Unmixing
Backpropagation
NUMERICAL TESTS
Semi-Synthetic Data
Accuracy in Counting Endmembers
Comparison of Other Methods With Incorrect Endmember Numbers
Real-World Data
Proposed Method
Findings
CONCLUDING REMARKS