Abstract

In recent years, deep learning (DL) has attracted increasing attention in hyperspectral unmixing (HU) applications due to its powerful learning and data-fitting ability. The autoencoder (AE) framework, as an unmixing baseline network, achieves good performance in HU by automatically learning low-dimensional embeddings and reconstructing data. Nevertheless, the conventional AE-based architecture, which focuses primarily on the pixel-level reconstruction loss, tends to lose significant detailed information about certain materials (e.g., material-related properties) during reconstruction. Therefore, inspired by the perception mechanism, we propose a cycle-consistency unmixing network, called CyCU-Net, which learns two cascaded AEs in an end-to-end fashion to enhance unmixing performance more effectively. CyCU-Net reduces the loss of detailed and material-related information during reconstruction by relaxing the original pixel-level reconstruction assumption to a cycle consistency governed by the cascaded AEs. More specifically, cycle consistency is achieved by a newly proposed self-perception loss, which consists of two spectral reconstruction terms and one abundance reconstruction term. By taking advantage of the self-perception loss in the network, high-level semantic information is well preserved in the unmixing process. Moreover, we investigate the performance gain of CyCU-Net with extensive ablation studies. Experimental results on one synthetic and three real hyperspectral data sets demonstrate the effectiveness and competitiveness of the proposed CyCU-Net in comparison with several state-of-the-art unmixing algorithms.
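To make the cascaded-AE idea concrete, the following is a minimal, hedged sketch of a cycle-consistency objective with two spectral reconstruction terms and one abundance reconstruction term, in the spirit of the self-perception loss described above. It assumes a PyTorch-style implementation; the toy architecture, the weights `lam_spec` and `lam_abund`, and the layer sizes are illustrative choices, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class SimpleAE(nn.Module):
    """Toy unmixing autoencoder: the encoder maps pixel spectra to abundances,
    and the decoder remixes abundances into spectra with a linear (endmember-like) layer."""
    def __init__(self, n_bands, n_endmembers):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_bands, n_endmembers),
            nn.Softmax(dim=-1),          # abundances sum to one (simplified constraint)
        )
        self.decoder = nn.Linear(n_endmembers, n_bands, bias=False)

    def forward(self, x):
        a = self.encoder(x)              # abundance estimate
        x_hat = self.decoder(a)          # reconstructed spectrum
        return a, x_hat

def self_perception_loss(x, ae1, ae2, lam_spec=1.0, lam_abund=1.0):
    """Illustrative cycle-consistency objective: two spectral reconstruction terms
    (one per cascaded AE) plus one abundance reconstruction term."""
    a1, x_hat1 = ae1(x)                  # first AE: pixels -> abundances -> spectra
    a2, x_hat2 = ae2(x_hat1)             # second AE re-encodes the reconstruction
    spec_loss = ((x - x_hat1) ** 2).mean() + lam_spec * ((x - x_hat2) ** 2).mean()
    abund_loss = lam_abund * ((a1 - a2) ** 2).mean()
    return spec_loss + abund_loss

# Usage on a batch of pixels (e.g., 224-band data, 4 endmembers)
x = torch.rand(16, 224)
ae1, ae2 = SimpleAE(224, 4), SimpleAE(224, 4)
loss = self_perception_loss(x, ae1, ae2)
loss.backward()
```

The key design point this sketch illustrates is that the second AE re-encodes the first AE's reconstruction, so the training signal penalizes not only the pixel-level spectral error but also disagreement between the two abundance estimates, which is what enforces cycle consistency.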
