Abstract

The accuracy of classification and retrieval depends significantly on the metric used to compute the similarity between samples. To preserve the geometric structure of the data, the symmetric positive definite (SPD) manifold has been introduced into the metric learning problem. However, the SPD constraint is too strict to describe real data distributions. In this paper, we extend the intrinsic metric learning problem to the semi-definite case, which better describes the data distribution for various classification tasks. First, we formulate metric learning as a minimization problem on an SPD manifold over a subspace, which not only balances the intra-class and inter-class information via an adaptive tradeoff parameter but also improves robustness through a low-rank subspace representation. This formulation makes it possible to design a structure-preserving algorithm on the subspace that exploits the geodesic structure of the SPD manifold. To solve this model, we develop an iterative strategy that alternately updates the intrinsic metric and the subspace structure. Finally, we compare the proposed method with ten state-of-the-art methods on four data sets. The numerical results validate that our method significantly improves the description of the data distribution and, hence, the performance of the image classification task.
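The abstract only sketches the optimization, so the following Python fragment is a rough illustration of the general alternating scheme, not the authors' algorithm: the learned semi-definite metric is assumed to factor as M = W A W^T, with an SPD matrix A updated on a low-rank subspace and the orthonormal basis W updated in turn. A fixed weight lam stands in for the paper's adaptive tradeoff parameter, and the names learn_low_rank_metric and pairwise_scatter are hypothetical.

    import numpy as np

    def pairwise_scatter(X, y):
        # Accumulate within-class (Sw) and between-class (Sb) scatter
        # matrices from all sample pairs.
        d = X.shape[1]
        Sw = np.zeros((d, d))
        Sb = np.zeros((d, d))
        for i in range(len(X)):
            for j in range(i + 1, len(X)):
                diff = (X[i] - X[j])[:, None]
                if y[i] == y[j]:
                    Sw += diff @ diff.T
                else:
                    Sb += diff @ diff.T
        return Sw, Sb

    def learn_low_rank_metric(X, y, rank=2, lam=0.5, lr=1e-3, iters=200):
        # Learn M = W @ A @ W.T, where W (d x r) is an orthonormal subspace
        # basis and A (r x r) is SPD, by alternating gradient steps that
        # minimize trace(M @ (Sw - lam * Sb)).
        d = X.shape[1]
        rng = np.random.default_rng(0)
        W, _ = np.linalg.qr(rng.standard_normal((d, rank)))
        A = np.eye(rank)
        Sw, Sb = pairwise_scatter(X, y)
        S = Sw - lam * Sb  # fixed tradeoff; the paper adapts this weight
        for _ in range(iters):
            # Metric step: gradient of trace(A W^T S W) w.r.t. A, then
            # eigenvalue clipping to stay SPD, and trace scaling to fix
            # the overall scale of the metric.
            A = A - lr * (W.T @ S @ W)
            evals, evecs = np.linalg.eigh((A + A.T) / 2)
            A = evecs @ np.diag(np.clip(evals, 1e-6, None)) @ evecs.T
            A *= rank / np.trace(A)
            # Subspace step: gradient w.r.t. W (S and A are symmetric),
            # then QR re-orthonormalization to keep W an orthonormal basis.
            W = W - lr * (2 * S @ W @ A)
            W, _ = np.linalg.qr(W)
        return W @ A @ W.T

    # Toy usage: two Gaussian classes in 5 dimensions.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
    y = np.array([0] * 20 + [1] * 20)
    M = learn_low_rank_metric(X, y)
    print("rank of learned metric:", np.linalg.matrix_rank(M))

The eigenvalue clipping and QR steps are one simple way to keep A on the SPD manifold and W on the set of orthonormal bases; a structure-preserving method of the kind described in the abstract would instead move along geodesics of these manifolds.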
