Abstract

Multispectral image (MSI) destriping is a challenging problem that has attracted considerable research attention in the remote sensing community because of its importance for improving image quality and downstream applications. Existing destriping methods focus mainly on matrix-based representations, which fail to fully exploit the correlation of the stripe component across both spatial dimensions. In this paper, we propose a novel MSI destriping method based on a low-rank tensor decomposition framework that decomposes the striped image into an image component and a stripe component. Specifically, for the image component, we use anisotropic spatial unidirectional total variation (TV) and spectral TV regularization to enhance piecewise smoothness in both the spatial and spectral domains. For the stripe component, we adopt tensor Tucker decomposition and $\ell_{2,1}$-norm regularization to model the spatial correlation and the group sparsity across all bands, respectively. An efficient algorithm based on the augmented Lagrange multiplier method is designed to solve the proposed optimization model. Experiments on various simulated and real-world cases demonstrate that the proposed model outperforms existing single-band and MSI destriping methods both qualitatively and quantitatively.
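To make the two regularizers concrete, the sketch below illustrates the $\ell_{2,1}$ group-sparsity penalty and an anisotropic unidirectional TV term on a toy image cube. This is only an illustrative NumPy sketch of the penalty terms, not the authors' full optimization model or ALM solver; the function names, unfolding convention, and toy data are assumptions for demonstration.

```python
import numpy as np

def l21_norm(S):
    """Group-sparse l_{2,1} norm: sum of column-wise l2 norms.

    When the stripe cube is unfolded so that each column gathers one
    stripe line across all rows and bands, this penalty drives whole
    stripe lines to zero together rather than individual pixels.
    """
    return np.linalg.norm(S, axis=0).sum()

def unidirectional_tv(X, axis):
    """Anisotropic (l1) total variation along a single spatial axis."""
    return np.abs(np.diff(X, axis=axis)).sum()

# Toy example: vertical stripes corrupt whole columns of every band,
# mimicking the band-shared stripe structure the abstract describes.
clean = np.ones((8, 8, 3))        # smooth cube: (rows, cols, bands)
stripes = np.zeros_like(clean)
stripes[:, 2, :] = 0.5            # one stripe column shared by all bands
observed = clean + stripes

# Unfold so each column of S_unfold is one image column across rows/bands.
S_unfold = stripes.transpose(1, 0, 2).reshape(stripes.shape[1], -1).T
print(l21_norm(S_unfold))                       # sqrt(6) ≈ 2.449: one active group
print(unidirectional_tv(clean, axis=1))         # 0.0: smooth image, no TV cost
print(unidirectional_tv(observed, axis=1))      # 24.0: stripes raise horizontal TV
```

Minimizing a weighted sum of such terms over the image/stripe split is what the proposed model formalizes; the point of the toy example is that the stripe energy concentrates in a single $\ell_{2,1}$ group while inflating the horizontal TV of the observed image.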
