Abstract

Metric learning is a critical problem in classification. Most classifiers rely on a metric; the simplest is the k-nearest-neighbor (KNN) classifier, whose outcome is decided directly by the given metric. This paper addresses semi-supervised metric learning. Most traditional semi-supervised metric learning algorithms preserve the local structure of all samples (labeled and unlabeled) in the input space while pulling samples with the same label together and pushing samples with different labels apart. In most existing methods, this local structure is computed from the Euclidean distance over all features. However, high-dimensional data typically lies on a low-dimensional manifold, and not all features are discriminative. In this paper, we therefore explore the latent structure of the samples and use the more discriminative features to compute the local structure. The latent structure is learned by a clustering random forest and cast into pairwise similarities between samples. Because of the hierarchical structure of the trees and their split functions, the similarity is derived from discriminative features. Experimental results on public data sets show that our algorithm outperforms related traditional algorithms.
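
The abstract does not spell out how the forest-based similarity is computed. A minimal sketch of one common realization is given below, assuming the affinity between two samples is the fraction of trees in which they land in the same leaf; the function name forest_affinity and the use of scikit-learn's RandomTreesEmbedding as a stand-in for the clustering random forest are our assumptions, not the authors' implementation.

    import numpy as np
    from sklearn.ensemble import RandomTreesEmbedding

    def forest_affinity(X, n_estimators=100, random_state=0):
        """Pairwise similarity from an unsupervised (clustering-style) random forest.

        Two samples are treated as similar in proportion to the number of trees
        in which they fall into the same leaf. Because each split selects a
        discriminative feature, the affinity reflects the latent structure of
        the data rather than a Euclidean distance over all features.
        (Illustrative sketch only, not the paper's exact algorithm.)
        """
        forest = RandomTreesEmbedding(n_estimators=n_estimators,
                                      random_state=random_state)
        # Leaf index of every sample in every tree: shape (n_samples, n_trees).
        leaves = forest.fit(X).apply(X)
        n = X.shape[0]
        S = np.zeros((n, n))
        for t in range(leaves.shape[1]):
            # Samples sharing a leaf in tree t get one vote of similarity.
            same_leaf = leaves[:, t][:, None] == leaves[:, t][None, :]
            S += same_leaf
        return S / leaves.shape[1]  # affinity in [0, 1]

The resulting affinity matrix can then replace the Euclidean k-nearest-neighbor graph when building the local-structure term of a semi-supervised metric learning objective.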
