Abstract

Hyperspectral unmixing is an active topic in signal and image processing. Non-negative matrix factorization (NMF) decomposes high-dimensional hyperspectral data into two non-negative low-dimensional matrices. However, because the objective function is non-convex, the algorithm admits many local solutions. Some algorithms address this problem by adding auxiliary constraints, such as sparsity. Sparse NMF performs well, but its results are unstable and sensitive to noise. Exploiting structural information in the unmixing process can stabilize the decomposition; previous work used clustering based on Euclidean distance to guide the decomposition and obtained good performance. However, the Euclidean distance measures only the straight-line distance between two points, whereas ground objects usually follow certain statistical distributions, and Euclidean distance cannot comprehensively capture the differences between such distributions. KL divergence is a better metric for this purpose. In this paper, we propose a new approach, KL divergence constrained NMF, which measures the difference between statistical distributions using KL divergence instead of Euclidean distance. Using KL divergence improves the accuracy of the structural information exploited by the algorithm. Experimental results on synthetic and real hyperspectral data show the superiority of the proposed algorithm over other state-of-the-art algorithms.
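To make the contrast between the two metrics concrete, the sketch below compares the Euclidean distance and the KL divergence on a pair of toy pixel spectra. This is a minimal illustration, not the paper's algorithm: treating each spectrum as a discrete distribution via unit-sum normalization, and the epsilon smoothing for numerical stability, are our assumptions.

```python
import numpy as np

def euclidean_distance(p, q):
    # Straight-line distance between two spectra as points in R^n.
    return np.linalg.norm(p - q)

def kl_divergence(p, q, eps=1e-12):
    # KL divergence D(p || q) between two spectra, each normalized
    # to unit sum so it can be treated as a discrete distribution.
    # eps avoids log(0) and division by zero (an assumption here).
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Two hypothetical non-negative pixel spectra (toy reflectance values).
p = np.array([0.20, 0.50, 0.30, 0.80])
q = np.array([0.30, 0.40, 0.20, 0.90])

print("Euclidean:", euclidean_distance(p, q))
print("KL:       ", kl_divergence(p, q))
```

Note that, unlike the Euclidean distance, the KL divergence is asymmetric and compares the shapes of the normalized spectra rather than their pointwise offsets, which is why it is better suited to measuring differences between statistical distributions.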
