Abstract

Spatial regularization based sparse unmixing has attracted much attention in the hyperspectral remote sensing image processing field. It combines spatial information with a sparse unmixing model, and has achieved improved fractional abundance results. However, the traditional spatial sparse unmixing approaches can only suppress discrete erroneous unmixing points and smooth abundance maps with low-contrast changes; they have no concept of scale difference. In this paper, to better extract the different levels of spatial details, a rolling guidance based scale-aware spatial sparse unmixing method (namely, Rolling Guidance Sparse Unmixing (RGSU)) is proposed to extract and recover the different levels of important structures and details in the hyperspectral remote sensing image unmixing procedure, as the different levels of structures and edges in remote sensing imagery carry different meanings and importance. Differing from the existing spatial regularization based sparse unmixing approaches, the proposed method considers the different levels of edges by combining a Gaussian filter-like method, which removes small-scale structures, with a joint bilateral filtering process, which accounts for the spatial domain and range domain correlations. The proposed method builds rolling guidance spatial regularization into a traditional spatial regularization sparse unmixing framework, and thereby accomplishes scale-aware sparse unmixing. The experimental results obtained with both simulated and real hyperspectral images show that the proposed method achieves better visual results and higher quantitative scores (i.e., higher SRE values) than the current state-of-the-art sparse unmixing algorithms, which illustrates the effectiveness of the rolling guidance based scale-aware method. In future work, an adaptive scale-aware spatial sparse unmixing framework will be studied and developed to enhance the current idea.
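To make the rolling guidance idea concrete, the sketch below follows the general rolling guidance filter scheme the abstract describes: a Gaussian filter first removes small-scale structures, and a joint bilateral filter then iteratively restores large-scale edges. This is a minimal illustrative implementation on a single 2-D band, not the authors' exact method, and all parameter values (`sigma_s`, `sigma_r`, `radius`, iteration count) are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def joint_bilateral(image, guide, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Smooth `image` with spatial Gaussian weights and range weights
    computed from `guide` (joint/cross bilateral filtering)."""
    h, w = image.shape
    out = np.zeros((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    img_p = np.pad(image.astype(float), radius, mode='reflect')
    gd_p = np.pad(guide.astype(float), radius, mode='reflect')
    for i in range(h):
        for j in range(w):
            patch = img_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            gpatch = gd_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights are taken from the guide, not the input image.
            rng = np.exp(-((gpatch - guide[i, j])**2) / (2.0 * sigma_r**2))
            wts = spatial * rng
            out[i, j] = (wts * patch).sum() / wts.sum()
    return out

def rolling_guidance(image, sigma_s=2.0, sigma_r=0.1, n_iter=4):
    # Step 1: Gaussian smoothing removes structures smaller than ~sigma_s.
    guide = gaussian_filter(image.astype(float), sigma_s)
    # Step 2: iteratively recover large-scale edges by filtering the
    # original image, guided by the result of the previous iteration.
    for _ in range(n_iter):
        guide = joint_bilateral(image, guide, sigma_s, sigma_r)
    return guide
```

Run on a band with a single large step edge, the filter averages away small-scale variation while the step itself survives, which is the scale-aware behavior exploited in the regularizer.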

Highlights

  • In the last decade, airborne and satellite hyperspectral remote sensing sensors have developed at an enormous rate, resulting in the availability of a large volume of hyperspectral remote sensing data with a wealth of spectral information and a high spectral resolution, covering a wide wavelength region with hundreds of spectral channels at a fine nominal spectral resolution

  • It can be observed that spatial regularization based sparse unmixing obtains significantly better results than the classical sparse unmixing methods SUnSAL and SU-NLE

  • Spatial regularization based sparse unmixing approaches can strongly suppress erroneous unmixing abundances, which illustrates the effectiveness of incorporating spatial information

Introduction

Airborne and satellite hyperspectral remote sensing sensors have developed at an enormous rate, resulting in the availability of a large volume of hyperspectral remote sensing data with a wealth of spectral information and a high spectral resolution, covering a wide wavelength region with hundreds of spectral channels at a fine nominal spectral resolution. The resulting hyperspectral data cube enables precise material identification from the abundant spectral information, as each pixel can be represented by a spectral signature or fingerprint that characterizes the underlying objects [1, 2]. However, owing to the limited spatial resolution of the imaging sensors, a single pixel often covers more than one material, producing mixed pixels. Spectral unmixing is a common way to solve this mixed pixel problem; it aims to estimate the fractional abundances of the pure spectral signatures, or endmembers, in each mixed pixel with linear or nonlinear mixture models [6, 7]. Compared with the nonlinear mixture model, the linear mixture model has been extensively studied as a result of its computational tractability, its flexibility in different applications, and the fact that it holds in macroscopic remote sensing scenarios. In this paper, we focus on linear spectral unmixing analysis.
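The linear mixture model described above can be sketched in a few lines. The example below uses a hypothetical random spectral library and plain nonnegative least squares as the solver; it illustrates the model y = Ax, not the sparse unmixing algorithms (e.g., SUnSAL) studied later, which additionally impose an L1 sparsity penalty on the abundances:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical toy setting: a library A of 4 endmember spectra over 20 bands.
rng = np.random.default_rng(0)
A = rng.random((20, 4))                    # spectral library (bands x endmembers)
x_true = np.array([0.6, 0.4, 0.0, 0.0])   # sparse fractional abundances
y = A @ x_true                             # observed mixed pixel (noise-free LMM: y = A x)

# Recover the abundances under the nonnegativity constraint; sparse unmixing
# solvers such as SUnSAL would additionally add an L1 term on x.
x_hat, _ = nnls(A, y)
```

In this noise-free, well-conditioned toy case nonnegative least squares recovers the true abundances; with real data, noise and highly correlated library spectra are what make the sparsity and spatial regularization terms necessary.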

