Abstract

Low rank representation is capable of capturing the global structure of data drawn from a mixture of subspaces, which are usually assumed to be independent. However, its computation is time-consuming. In practice, the data often lie on subspaces that intersect or even overlap with one another, so the local structure of the data in the overlapping regions is important. Sparsity is a useful property for accelerating the algorithm and capturing the local linear structure. Its drawback is that, when combined with low rank representation, it breaks the low rank property of the reconstruction coefficient matrix. In order to combine the two advantages properly, in this paper we introduce a new constraint on the low rank representation matrix, which we call sparse congruency. Fortunately, we find that this new constraint is a simplification of the joint low rank and sparse constraints. Several experiments demonstrate the efficiency of our method in semi-supervised classification.
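For reference, a minimal sketch of the models the abstract alludes to, written in standard low rank representation notation that is assumed rather than taken from this paper (X denotes the data matrix, Z the reconstruction coefficient matrix, E the error term, and λ, β trade-off parameters); the proposed sparse congruency constraint is described as a simplification of the second, jointly low rank and sparse, formulation, whose exact form is not reproduced here:

  % Standard low rank representation: the nuclear norm \|Z\|_* encourages a globally low rank Z.
  \min_{Z,E} \; \|Z\|_* + \lambda \|E\|_{2,1} \quad \text{s.t.} \quad X = XZ + E

  % Joint low rank and sparse model: the added \|Z\|_1 term promotes locally sparse coefficients,
  % but the extra penalty can weaken the low rank structure of Z.
  \min_{Z,E} \; \|Z\|_* + \beta \|Z\|_1 + \lambda \|E\|_{2,1} \quad \text{s.t.} \quad X = XZ + E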
