Abstract

Large Margin Nearest Neighbor (LMNN), a classic distance metric learning (DML) method, has attracted much attention among researchers. However, like most existing DML methods, it is not guaranteed to learn independent and shared feature subspaces from multiple sources or different feature subsets, so much statistical feature information is ignored during model learning. In this paper, we propose a novel DML model, called Multi-view DML Based on Independent and Shared Feature Subspace (MVML-ISFS), which learns multiple distance metrics to unify the information from multiple views. The proposed method finds a distance metric for each view in an independent feature space to preserve that view's specific properties, as well as a sparse representation relating the distance metrics from distinct views in a shared feature space to retain their common properties. The objective of MVML-ISFS is formulated based on LMNN, thus encouraging a large margin for each view so that the distance between each same-class pair of samples is smaller than that between each different-class pair. The MVML-ISFS model involves multiple variables, which are optimized by a gradient descent strategy. Experimental results show the effectiveness of MVML-ISFS on remote sensing, face, forest fire, and UCI datasets.
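To make the LMNN-style large-margin criterion mentioned above concrete, the following is a minimal sketch (not the paper's MVML-ISFS implementation) of a squared Mahalanobis distance under a learned metric M and the hinge penalty that fires when a different-class "impostor" is not at least a unit margin farther from a sample than its same-class target neighbor; the function names and the `margin` parameter are illustrative assumptions.

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) under metric M."""
    d = x - y
    return float(d @ M @ d)

def lmnn_hinge(x, target, impostor, M, margin=1.0):
    """LMNN-style hinge loss: nonzero when the impostor (different class)
    is not at least `margin` farther from x than the same-class target."""
    return max(0.0,
               margin
               + mahalanobis_sq(x, target, M)
               - mahalanobis_sq(x, impostor, M))

# With M = identity this reduces to squared Euclidean distance:
M = np.eye(2)
x = np.array([0.0, 0.0])
target = np.array([1.0, 0.0])       # same-class neighbor, distance^2 = 1
far_impostor = np.array([3.0, 0.0])  # distance^2 = 9 -> margin satisfied
near_impostor = np.array([0.5, 0.0])  # distance^2 = 0.25 -> margin violated

print(lmnn_hinge(x, target, far_impostor, M))   # 0.0 (no penalty)
print(lmnn_hinge(x, target, near_impostor, M))  # 1.75 (penalty)
```

In a full LMNN-style optimization, such hinge terms are summed over triplets and M is updated by gradient descent while being kept positive semidefinite; MVML-ISFS extends this idea with per-view metrics plus a shared-subspace coupling.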
