Abstract

Hyperspectral images (HSIs) with abundant spectral information are generally susceptible to various types of noise, such as Gaussian noise and stripe noise. Recently, a few quality-based selection algorithms have been proposed to remove noisy bands from HSIs. However, these methods cannot discriminate the mixed-noise bands of HSIs and are sensitive to variations in image content and luminance. Here, we develop a mixed-noise band selection framework that can effectively separate the Gaussian-noise and stripe-noise bands of HSIs. We first improve tensor decomposition to reconstruct the mixed-noise components and low-rank components, which reduces the influence of image content and luminance changes. Spectral smoothness constraints and unidirectional total variation are incorporated into the decomposition model to enhance the separation performance for Gaussian and stripe noise. Then, different statistical features, including Weibull and histogram of oriented gradient (HOG) features, are applied to extract robust parameters from the mixed-noise components. More importantly, an extreme learning machine (ELM) is trained to predict the noise bands; the ELM has an extremely fast learning speed and tends to achieve better performance than other networks. Finally, by aggregating all these strategies, our method can select the mixed-noise bands efficiently. Experimental results on both synthetic and real noisy HSIs indicate that the proposed method outperforms state-of-the-art methods.
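The unidirectional total variation mentioned above exploits the directional structure of stripe noise: stripes vary sharply across the stripe direction but are nearly constant along it. A minimal pure-Python sketch of this measure is shown below; the function name and toy data are illustrative assumptions, not taken from the paper.

```python
def unidirectional_tv(img, axis=0):
    """Total variation of a 2-D image (list of lists) along one axis.

    axis=0 sums absolute differences between adjacent rows;
    axis=1 sums absolute differences between adjacent columns.
    """
    h, w = len(img), len(img[0])
    tv = 0.0
    if axis == 0:  # differences across rows
        for i in range(h - 1):
            for j in range(w):
                tv += abs(img[i + 1][j] - img[i][j])
    else:          # differences across columns
        for i in range(h):
            for j in range(w - 1):
                tv += abs(img[i][j + 1] - img[i][j])
    return tv

# A toy band with vertical stripes: all rows identical, so the TV
# across rows is zero while the TV across columns is large.
stripes = [[float(j % 2) for j in range(8)] for _ in range(8)]
print(unidirectional_tv(stripes, axis=0))  # → 0.0
print(unidirectional_tv(stripes, axis=1))  # → 56.0
```

Penalizing only the high-variation direction (rather than isotropic TV) is what lets the decomposition model attribute this directional energy to the stripe component while leaving the underlying scene intact.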

Highlights

  • Hyperspectral images (HSIs) are widely applied in different fields, including hyperspectral classification [1], [2], object recognition, detection [3], [4] and superresolution [5], [6]

  • Some bands of HSIs are inevitably corrupted by various types of noise, degrading the image quality greatly [7], [8], which directly deteriorates the related processing of the HSIs

  • We report the mean peak signal-to-noise ratio (MPSNR) and mean structural similarity index (MSSIM) values of the reference images and reconstructed images


Summary

INTRODUCTION

Hyperspectral images (HSIs) are widely applied in different fields, including hyperspectral classification [1], [2], object recognition, detection [3], [4] and superresolution [5], [6]. In [17], Mittal proposed the blind/referenceless image spatial quality evaluator (BRISQUE) and the natural image quality evaluator separately, employing the natural scene statistics (NSS) of local luminance values to quantify the quality of distorted images. These models are highly sensitive to image content and illumination variation, and approximation issues arise in the method [18]. Liu et al. [22] collected gradient statistics features and employed an AdaBoost neural network to represent image quality. These models are sensitive to luminance variation and cannot deal with stripe noise or be applied directly to HSI quality assessment. In [23], Yang et al. designed a spectral-spatial quality metric that extracts quality-sensitive features from image structure, texture and spectra. This metric evaluates HSIs as a whole and cannot be applied to remove mixed-noise bands.
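The NSS-based models cited above (such as BRISQUE) build their features on mean-subtracted contrast-normalized (MSCN) coefficients of local luminance. The sketch below illustrates that normalization in pure Python; it is a simplification that uses a box window rather than the Gaussian window BRISQUE actually specifies, and the function name is illustrative.

```python
def mscn(img, win=1, c=1.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients.

    Simplified sketch: each pixel is normalized by the mean and
    standard deviation of its local neighborhood (a box window of
    half-width `win`, instead of BRISQUE's Gaussian window).
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            patch = [img[a][b]
                     for a in range(max(0, i - win), min(h, i + win + 1))
                     for b in range(max(0, j - win), min(w, j + win + 1))]
            mu = sum(patch) / len(patch)
            var = sum((p - mu) ** 2 for p in patch) / len(patch)
            out[i][j] = (img[i][j] - mu) / (var ** 0.5 + c)
    return out

# A perfectly flat image has zero MSCN coefficients everywhere,
# which is why these statistics track local structure, not brightness.
flat = [[5.0] * 4 for _ in range(4)]
print(mscn(flat))  # → all zeros
```

Because the normalization divides out local luminance and contrast, deviations of the MSCN distribution from that of natural images signal distortion; this is also why such metrics struggle with structured distortions like stripes, whose statistics resemble genuine edges.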

RELATED TENSOR DECOMPOSITION MODEL
LOCALLY SMOOTH SPECTRUM CONSTRAINT
UNIDIRECTIONAL TOTAL VARIATION
MULTIPLE STATISTICAL FEATURES OF THE NOISE COMPONENT
EXPERIMENT
SEPARATION RESULTS FOR MIXED-NOISE DATA
SEPARATION RESULTS FOR ACTUAL NOISY DATA
STRIPE NOISE BAND SELECTION USING HOG FEATURES
Findings
CONCLUSION

