Abstract

Transfer learning is an effective way to deal with real-world problems in which the training and test data are drawn from different distributions. Transfer learning methods use a labeled source domain to boost performance on a task in a target domain that may be unsupervised or semi-supervised. However, previous transfer learning algorithms rely on the Euclidean or Mahalanobis distance to represent the relationships between instances and to capture the geometry of the manifold. In many real-world scenarios this is not enough, and these distance functions fail to capture the intrinsic geometry of the manifold in which the data lies. In this paper, we propose a transfer learning framework called Semi-Supervised Metric Transfer Learning with Relative Constraints (SSMTR), which uses distance metric learning with a set of relative distance constraints that better capture the similarities and dissimilarities between the source and target domains. In SSMTR, instance weights are learned for the different domains and used to reduce the domain shift, while a relative distance metric is learned in parallel. We have also developed SSMTR for classification problems and have conducted extensive experiments on several real-world datasets, in particular the PIE Face, Office-Caltech, and USPS-MNIST datasets, to compare the accuracy of our proposed algorithm against current transfer learning algorithms.
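To make the idea of relative distance constraints concrete, the following is a minimal sketch (not the paper's actual algorithm) of learning a Mahalanobis metric from triplet constraints of the form "anchor a is closer to p than to n". All function names and hyperparameters here are illustrative assumptions; the paper's SSMTR additionally learns instance weights across domains, which this sketch omits.

```python
import numpy as np

def mahalanobis_sq(M, x, y):
    """Squared Mahalanobis distance (x - y)^T M (x - y)."""
    d = x - y
    return float(d @ M @ d)

def triplet_hinge_loss(M, triplets, margin=1.0):
    """Hinge loss over relative constraints d(a, p) + margin <= d(a, n)."""
    return sum(
        max(0.0, margin + mahalanobis_sq(M, a, p) - mahalanobis_sq(M, a, n))
        for a, p, n in triplets
    )

def learn_metric(triplets, dim, lr=0.01, margin=1.0, steps=200):
    """Subgradient descent on M, projecting back onto the PSD cone
    after each step so M remains a valid (pseudo-)metric."""
    M = np.eye(dim)
    for _ in range(steps):
        grad = np.zeros((dim, dim))
        for a, p, n in triplets:
            # Only violated constraints contribute a subgradient.
            if margin + mahalanobis_sq(M, a, p) - mahalanobis_sq(M, a, n) > 0:
                dp, dn = a - p, a - n
                grad += np.outer(dp, dp) - np.outer(dn, dn)
        M -= lr * grad
        # Project onto positive semidefinite matrices.
        w, V = np.linalg.eigh(M)
        M = (V * np.clip(w, 0.0, None)) @ V.T
    return M

# Toy example: feature 0 is discriminative, feature 1 is noise, so the
# Euclidean (identity) metric cannot satisfy the relative constraint.
a = np.array([0.0, 0.0])
p = np.array([0.1, 2.0])   # should be near the anchor
n = np.array([2.0, 0.1])   # should be far from the anchor
triplets = [(a, p, n)]
M = learn_metric(triplets, dim=2)
```

Under the learned metric, the noisy dimension is downweighted and the violated constraint is resolved, which the Euclidean metric alone could not achieve for this triplet.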
