Abstract

Hyperspectral anomaly detection (HAD) is challenging because it must uncover the intrinsic structure of complex, high-dimensional signals without any labeled samples at training time. Deep neural networks (DNNs) can model the underlying distribution of hyperspectral data, but they are constrained by the cost of labeling large-scale hyperspectral datasets; the low spatial resolution of hyperspectral imagery makes labeling even more difficult. To address this problem while preserving detection performance, we present an unsupervised low-rank embedded network (LREN) in this paper. LREN is a joint learning network in which the latent representation is designed specifically for HAD rather than serving merely as a feature input to a detector. It searches for the lowest-rank representation over a representative and discriminative dictionary in the deep latent space to estimate the residual efficiently. Considering the physical mixing process in hyperspectral imaging, we develop a trainable density estimation module based on a Gaussian mixture model (GMM) in the deep latent space to construct a dictionary that better characterizes complex hyperspectral images (HSIs). Equipped with the closed-form solution of the proposed low-rank learner, LREN surpasses existing approaches on four real hyperspectral datasets with different anomalies. We argue that this unified framework offers a novel way to combine feature-extraction and anomaly-estimation methods for HAD, learning an underlying representation tailored to HAD without the prerequisite of manually labeled data. Code is available at https://github.com/xdjiangkai/LREN.
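To make the density-estimation idea concrete, the following is a minimal sketch, not the authors' implementation, of a trainable GMM density estimator operating on deep latent codes, in the spirit of DAGMM-style estimation networks. The class name LatentGMM, the estimation-network layout, and the values of latent_dim and n_components are illustrative assumptions; the actual LREN architecture (including how the GMM statistics feed the dictionary of the low-rank learner) is in the linked repository.

# Illustrative sketch only: a trainable GMM density estimator over latent codes.
# LatentGMM, latent_dim, and n_components are assumed names/values, not LREN's.
import torch
import torch.nn as nn


class LatentGMM(nn.Module):
    def __init__(self, latent_dim: int = 8, n_components: int = 4):
        super().__init__()
        self.latent_dim = latent_dim
        self.n_components = n_components
        # Small estimation network: latent code -> soft component assignments.
        self.estimator = nn.Sequential(
            nn.Linear(latent_dim, 16),
            nn.Tanh(),
            nn.Linear(16, n_components),
            nn.Softmax(dim=1),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        """Return per-pixel negative log-likelihood (energy) under the GMM."""
        gamma = self.estimator(z)                        # (N, K) responsibilities
        phi = gamma.mean(dim=0)                          # (K,)  mixture weights
        # Responsibility-weighted component means: (K, D)
        mu = (gamma.t() @ z) / gamma.sum(dim=0).unsqueeze(1)
        # Responsibility-weighted covariances: (K, D, D)
        diff = z.unsqueeze(1) - mu.unsqueeze(0)          # (N, K, D)
        cov = torch.einsum("nk,nkd,nke->kde", gamma, diff, diff)
        cov = cov / gamma.sum(dim=0).view(-1, 1, 1)
        cov = cov + 1e-4 * torch.eye(self.latent_dim, device=z.device, dtype=z.dtype)
        # Log-density of every sample under every component: (N, K)
        mvn = torch.distributions.MultivariateNormal(mu, covariance_matrix=cov)
        log_prob = mvn.log_prob(z.unsqueeze(1))
        # Energy = negative log of the mixture likelihood; higher = more anomalous.
        return -torch.logsumexp(torch.log(phi + 1e-12).unsqueeze(0) + log_prob, dim=1)


# Usage sketch: score pixels whose spectra were already encoded into an 8-D
# latent space by some encoder (assumed, not shown here).
latent = torch.randn(1024, 8)            # 1024 pixels, 8-D latent codes
scores = LatentGMM(latent_dim=8)(latent)
print(scores.shape)                      # torch.Size([1024])

In LREN itself the GMM statistics also serve to build the representative dictionary used by the low-rank learner; the sketch above covers only the latent density-estimation step.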
