Abstract

Total variation (TV) regularized low-rank models have emerged as a powerful tool for hyperspectral image (HSI) denoising. TV, defined as the ℓ1-norm of gradients, implicitly assumes, from a statistical point of view, that gradients obey a Laplacian distribution. By investigating the gradient histograms of real HSIs, we find that their gradients actually follow a hyper-Laplacian distribution with power parameter q = 1/2. Taking this prior into account, we propose a hyper-Laplacian spectral-spatial total variation (HTV), defined as the ℓ1/2-norm of gradients, for HSI denoising. By incorporating HTV as the regularizer, we further propose a low-rank matrix model and a low-rank tensor model, both of which can be solved by the augmented Lagrange multiplier algorithm. To validate the effectiveness of HTV, we formulate baseline models by replacing HTV with ℓ1-norm and ℓ0-norm based TV regularizations, and the results reveal that HTV outperforms both. Moreover, experiments on both simulated and real data demonstrate the superiority of the proposed models over several popular HSI denoising algorithms.
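The core quantity described above, an ℓq "norm" of first-order differences with q = 1/2, can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name `htv`, the cube layout `(height, width, bands)`, and the choice of anisotropic (per-axis, absolute-value) differences are all assumptions made for the example.

```python
import numpy as np

def htv(X, q=0.5):
    """Hypothetical sketch of a hyper-Laplacian spectral-spatial TV:
    the sum of |gradient|^q over first-order differences along the two
    spatial axes and the spectral axis of an HSI cube X with shape
    (height, width, bands). q = 0.5 gives the l_{1/2} penalty from the
    abstract; q = 1.0 recovers the ordinary anisotropic l_1 TV."""
    gx = np.diff(X, axis=0)  # vertical spatial differences
    gy = np.diff(X, axis=1)  # horizontal spatial differences
    gz = np.diff(X, axis=2)  # spectral (band-to-band) differences
    return sum(np.sum(np.abs(g) ** q) for g in (gx, gy, gz))
```

In practice this term would serve as the regularizer inside a low-rank matrix or tensor recovery objective, where nonconvexity of the ℓ1/2 penalty is handled by the solver (here, per the abstract, an augmented Lagrange multiplier scheme) rather than by direct evaluation.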
