Abstract

Hyperspectral Anomaly Detection (HAD) aims to detect pixels or targets whose spectral characteristics differ significantly from those of their surroundings. The accuracy of the reconstructed background model is an essential factor in improving HAD performance. This paper proposes a Hyperspectral Anomaly Detection method based on Attention-aware Spectral Difference Representation (HAD-ASDR), which reconstructs more accurate background models by using a generated noise distribution that matches the background as input. The proposed HAD-ASDR mainly comprises three modules: an Attention-aware Spectral Difference Representation Module (ASDRM), a Convolutional Auto-Encoder based Background Reconstruction Module (CAE-BRM) and a Joint Spectrum Intensity and Angle based Anomaly Detection Module (JSIA-ADM). First, inspired by the Generative Adversarial Network (GAN), ASDRM generates a noise distribution that better matches the background through an attention mechanism and a difference operation. Then, CAE-BRM reconstructs an accurate background from the generated noise distribution using a convolutional auto-encoder with skip connections. Finally, JSIA-ADM detects anomalies more accurately by computing reconstruction errors from both the spectral intensity and the spectral angle perspectives. The proposed HAD-ASDR has been verified on five data sets and achieves better or comparable HAD results compared to six other methods. Its average AUC over these five data sets is 0.9817, an improvement of 0.0253 over the comparison methods. The experimental results demonstrate its superior performance and stability.
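
To make the JSIA-ADM idea concrete, the sketch below shows one plausible way to score anomalies from reconstruction errors measured in both spectral intensity (Euclidean distance between a pixel's spectrum and its reconstructed background spectrum) and spectral angle (the angle between the two spectra). This is not the authors' code: the array shapes, the min-max normalization, and the weighted-sum combination controlled by `alpha` are all assumptions made for illustration.

```python
# Hypothetical sketch of joint spectral intensity and angle based anomaly scoring.
import numpy as np

def anomaly_score(original, reconstructed, alpha=0.5, eps=1e-12):
    """Per-pixel anomaly score from a hyperspectral cube and its reconstructed background.

    original, reconstructed: arrays of shape (H, W, B) with B spectral bands.
    alpha: assumed weight balancing the intensity and angle terms.
    """
    # Spectral intensity error: Euclidean distance between the two spectra.
    intensity_err = np.linalg.norm(original - reconstructed, axis=-1)

    # Spectral angle error: angle between the original and reconstructed spectra.
    dot = np.sum(original * reconstructed, axis=-1)
    norms = np.linalg.norm(original, axis=-1) * np.linalg.norm(reconstructed, axis=-1)
    angle_err = np.arccos(np.clip(dot / (norms + eps), -1.0, 1.0))

    # Min-max normalize each term so neither dominates, then combine (assumed weighting).
    def normalize(x):
        return (x - x.min()) / (x.max() - x.min() + eps)

    return alpha * normalize(intensity_err) + (1 - alpha) * normalize(angle_err)


if __name__ == "__main__":
    # Toy example: score a random 100x100 cube with 50 bands against a perturbed copy.
    cube = np.random.rand(100, 100, 50).astype(np.float32)
    background = cube + 0.01 * np.random.randn(100, 100, 50).astype(np.float32)
    scores = anomaly_score(cube, background)
    print(scores.shape)  # (100, 100)
```

Pixels that the background model reconstructs poorly in either magnitude or spectral shape receive high scores; thresholding the score map (or feeding it to an ROC/AUC evaluation) yields the final detection result.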
