Abstract

To make overlapping classes more separable, distance metric learning (DML) learns a distance metric from both the data features and the label information of instances. However, most existing DML methods are kNN DML methods: they adopt a kNN model to classify instances. The drawbacks of kNN DML are that all training instances must be stored and accessed to predict a test instance, and that classification performance depends on the choice of the nearest-neighbor number k. To overcome these problems, several DML methods construct non-kNN multi-class models for prediction, but all of them are non-convex, and a convex non-kNN DML method has not yet been explicitly proposed. In this paper, we propose a convex non-kNN DML model for the multi-class classification problem, called DML-based class-to-instance (C2I) confidence (DM-C2IC). Specifically, we learn a Mahalanobis distance that makes overlapping classes more separable and build a DML-based C2I confidence model on top of it. To capture the correlations between different classes, the model is trained so that, for each instance, the DML-based confidence of its true class is larger than the DML-based confidence of every other class. An iterative approach is developed to optimize DM-C2IC, which converges to the global optimum. Moreover, to further boost classification performance, a kernel version of DM-C2IC is introduced. Extensive experiments show that the proposed DM-C2IC method outperforms existing kNN DML methods.
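
For intuition, the sketch below restates in symbols the standard Mahalanobis distance and the ranking-style training goal described above. The notation (M, x_i, y_i, f_c) is illustrative and assumed here; it is not taken from the paper's exact formulation.

```latex
% Illustrative notation (assumed, not the paper's exact formulation):
% M     -- a learned positive semidefinite matrix parameterizing the metric
% x_i   -- a training instance with true class label y_i
% f_c() -- the DML-based C2I confidence that an instance belongs to class c

% Standard Mahalanobis distance induced by M:
\[
  d_M(x_i, x_j) = \sqrt{(x_i - x_j)^\top M \, (x_i - x_j)}, \qquad M \succeq 0 .
\]

% Ranking-style training goal: for every instance, the confidence of its
% true class should exceed the confidence of every other class.
\[
  f_{y_i}(x_i) > f_c(x_i) \quad \text{for all } c \neq y_i .
\]
```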
