Abstract

Multi-metric learning is an important technique for improving classification performance, since learning a single metric is usually insufficient for complex data. However, most existing multi-metric learning approaches have high computational complexity. In this work, two multi-metric learning frameworks are proposed to perform supervised and semi-supervised classification, respectively. Based on these frameworks, we first design a low-rank multi-metric learning model (LSMML) for supervised classification, in which multiple local class metrics and one global metric are jointly trained. A joint regularization scheme, composed of a LogDet divergence term and a low-rank term, is carefully designed to incorporate prior knowledge and improve generalization. By learning appropriate metrics, LSMML not only captures the local nonlinear discriminative information of each class to reduce the probability of misclassification, but also enhances stability, alleviates the computational burden, and avoids the risk of overfitting. We then extend LSMML to the semi-supervised scenario and propose a low-rank semi-supervised multi-metric learning approach (LSeMML) to handle data with scarce labels. Alternating iterative algorithms are designed to optimize both LSMML and LSeMML; at each iteration, we only need to solve geodesically convex optimization problems, each with a closed-form solution and low computational cost. Numerical simulations on different databases, in both supervised and semi-supervised settings, show that the proposed LSMML and LSeMML have a simple form, fast training speed, and good classification performance.
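
To make the abstract's ingredients concrete, the following is a minimal Python sketch of the two building blocks it names: the LogDet divergence used as a regularizer that keeps a learned metric close to a prior, and a multi-metric Mahalanobis decision rule that mixes one local metric per class with a shared global metric. The nearest-class-mean rule, the mixing weight `alpha`, and all function names here are illustrative assumptions; the actual LSMML objective and decision rule are defined in the full paper, not in this abstract.

```python
import numpy as np

def logdet_divergence(A, A0):
    """LogDet (Burg matrix) divergence:
    D_ld(A, A0) = tr(A A0^{-1}) - log det(A A0^{-1}) - d.
    Often used to regularize a learned metric A toward a prior metric A0
    while implicitly keeping A positive definite."""
    d = A.shape[0]
    P = A @ np.linalg.inv(A0)
    _, logdet = np.linalg.slogdet(P)  # stable log-determinant
    return np.trace(P) - logdet - d

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) under metric M."""
    diff = x - y
    return float(diff @ M @ diff)

def predict(x, class_means, local_metrics, global_metric, alpha=0.5):
    """Hypothetical nearest-class-mean rule: for each class, blend its local
    metric with the shared global metric, then pick the closest class mean.
    This is a stand-in for whatever decision rule LSMML actually uses."""
    best_label, best_dist = None, np.inf
    for label, mu in class_means.items():
        M = alpha * local_metrics[label] + (1 - alpha) * global_metric
        dist = mahalanobis_sq(x, mu, M)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 3
    means = {0: np.zeros(d), 1: np.ones(d)}
    I = np.eye(d)
    local = {0: I, 1: 2.0 * I}
    print(predict(rng.normal(size=d), means, local, global_metric=I))
    print(logdet_divergence(2.0 * I, I))  # divergence from the identity prior
```

The sketch only illustrates why jointly learned local and global metrics can help: each class gets a metric adapted to its own local geometry, while the shared global metric acts as a stabilizer, which is consistent with the abstract's claim that the combination reduces misclassification while controlling overfitting.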
