Abstract

Recently, block-term decomposition with rank-$(L_r, L_r, 1)$ (termed LL1 decomposition), which is physically inspired by linear spectral unmixing, has received increasing attention in hyperspectral image (HSI) denoising. However, due to the intrinsic nonlinear structure of real-world HSIs, their low-rankness is usually implicit. Moreover, the low-rank assumption on the abundance maps is often unsupported in real scenarios, so the essential uniqueness guarantee is usually violated, which hampers the successful deployment of LL1 decomposition. Inspired by nonlinear spectral unmixing, we propose a nonlinear learnable transform-based LL1 decomposition (NT-LL1) for characterizing the implicit low-rank structure of real-world HSIs. More concretely, the nonlinear learnable transform in the NT-LL1 decomposition is a composed transform consisting of a linear semi-orthogonal transform and a component-wise nonlinear transform, which collaboratively enhance the low-rankness of the abundance maps. Empowered by the NT-LL1 decomposition, we propose an NT-LL1 decomposition-based model for HSI denoising. To tackle the resulting model, we develop an efficient proximal alternating minimization-based algorithm with a convergence guarantee. Extensive experimental results on both simulated and real data collectively verify the superiority of the proposed method compared with competing methods.
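For context, the following is a minimal sketch of the classical LL1 model that NT-LL1 builds on; the decomposition itself is standard, while the transform symbols $\mathbf{Q}$ and $\phi$ used to illustrate NT-LL1 are placeholders rather than the paper's own notation.

% Classical rank-(L_r, L_r, 1) block-term (LL1) decomposition of an HSI tensor
% \mathcal{X} \in \mathbb{R}^{I \times J \times K} (two spatial modes, one spectral mode):
\[
  \mathcal{X} \;\approx\; \sum_{r=1}^{R}
  \underbrace{\bigl(\mathbf{A}_r \mathbf{B}_r^{\top}\bigr)}_{\text{abundance map } \mathbf{E}_r}
  \circ \, \mathbf{c}_r,
  \qquad
  \mathbf{A}_r \in \mathbb{R}^{I \times L_r},\;
  \mathbf{B}_r \in \mathbb{R}^{J \times L_r},\;
  \mathbf{c}_r \in \mathbb{R}^{K},
\]
% where each abundance map E_r = A_r B_r^T has rank at most L_r and c_r is the r-th
% endmember spectrum, mirroring linear spectral unmixing. NT-LL1, as described in the
% abstract, instead enforces low-rankness of the abundance maps in a learned domain,
% e.g. requiring \phi(Q E_r) to be low-rank with Q semi-orthogonal (Q^T Q = I) and
% \phi a component-wise nonlinearity; this last line is an illustrative reading only.

In this sketch, the composed transform $\phi \circ \mathbf{Q}$ plays the role of the nonlinear learnable transform described above, with the component-wise nonlinearity intended to absorb the nonlinear structure that makes the low-rankness of real-world HSIs only implicit.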
