Abstract

Latent low-rank representation (LatLRR) is an important self-representation technique that improves low-rank representation (LRR) by using both observed and unobserved (hidden) samples. It can simultaneously learn the low-dimensional structure embedded in the data space and capture salient features. However, LatLRR ignores the local geometric structure and can be affected by noise and redundancy in the original data space. To address these problems, we propose a latent LRR with a weighted distance penalty (LLRRWD) for clustering in this article. First, a weighted distance is proposed that enhances the original Euclidean distance by enlarging the distances between unconnected samples, which improves the discrimination of inter-sample distances. Leveraging this weighted distance, a weighted distance penalty is introduced into the LatLRR model so that the method preserves both local geometric information and global structural information, improving the discrimination of the learned affinity matrix. Moreover, a weight matrix is imposed on the sparse error norm to reduce the effect of noise and redundancy. Experimental results on several benchmark databases demonstrate the effectiveness of the proposed method for clustering.
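For context, the baseline LatLRR model decomposes the data matrix X into a column-space component XZ, a row-space (salient feature) component LX, and a sparse error E. The following is only a minimal sketch of how the two modifications described above could enter this objective; the coefficient beta, the weighted distance matrix D_w, the error-weight matrix M, and the exact form of the added terms are illustrative assumptions, since the abstract does not specify them.

% Baseline LatLRR: nuclear norms on Z and L, l1 norm on the sparse error E.
\min_{Z,L,E}\ \|Z\|_* + \|L\|_* + \lambda\|E\|_1
\quad \text{s.t.}\quad X = XZ + LX + E .

% Sketch of an LLRRWD-style objective (assumed form): a weighted distance
% penalty \beta\|D_w \odot Z\|_1 discourages large representation coefficients
% between samples that are far apart under the weighted distance, preserving
% local geometry, while the weight matrix M reweights the sparse error term
% to suppress noise and redundancy.
\min_{Z,L,E}\ \|Z\|_* + \|L\|_* + \beta\|D_w \odot Z\|_1 + \lambda\|M \odot E\|_1
\quad \text{s.t.}\quad X = XZ + LX + E .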
