Abstract

Online learning for multi-class classification is a well-studied topic in machine learning. The standard online multi-class classification setting assumes that the ground-truth class label is always available. In many real-life applications, however, only partial feedback can be obtained: the learner observes whether its prediction was correct, but the true label remains unknown after an erroneous prediction. Under such feedback, learning may be slower and classifiers less accurate than in the full-feedback scenario. Although several online learning algorithms with partial feedback have been proposed, real-world applications would still benefit from further performance improvement. In this paper, we exploit transfer learning to improve learning in the case of erroneous predictions. We propose the Partial Feedback Online Transfer Learning (PFOTL) algorithm, which uses knowledge learned from the source domain in addition to the received partial feedback, and we present an analysis and a mistake bound for the algorithm. In experiments on four benchmark datasets, the proposed algorithm achieves higher online cumulative accuracy than comparable state-of-the-art algorithms. Two potential applications of our work are online recommender systems and privacy protection.
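To make the partial-feedback setting concrete, the sketch below shows one round of a Banditron-style multiclass perceptron, a standard baseline for this setting (this is an illustrative sketch, not the proposed PFOTL algorithm): the learner samples a prediction, receives only a correct/incorrect signal, and builds an unbiased update from that signal via the exploration probabilities. Function and variable names are ours, chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandit_step(W, x, y_true, gamma=0.05):
    """One round of bandit-feedback multiclass learning (Banditron-style sketch).

    W: (K, d) weight matrix, x: (d,) feature vector, y_true: true label.
    The learner only observes whether its sampled prediction was correct;
    the true label itself is never revealed on a mistake.
    """
    K = W.shape[0]
    y_hat = int(np.argmax(W @ x))          # greedy prediction
    # Exploration: predict y_hat with high probability, else a random label.
    probs = np.full(K, gamma / K)
    probs[y_hat] += 1.0 - gamma
    y_tilde = int(rng.choice(K, p=probs))
    correct = bool(y_tilde == y_true)      # the ONLY feedback received
    # Importance-weighted update: unbiased in expectation over y_tilde.
    U = np.zeros_like(W)
    if correct:
        U[y_tilde] += x / probs[y_tilde]
    U[y_hat] -= x
    return W + U, correct
```

The importance weighting by `1 / probs[y_tilde]` is what lets a single correctness bit stand in for the full label in expectation; transfer-learning approaches such as the one proposed here aim to compensate for the information lost on the rounds where the prediction is wrong.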
