Abstract

We present a semi-supervised multitask learning (MTL) framework in which multiple partially labeled data manifolds are available, each defining a classification task for which we wish to design a semi-supervised classifier. These data sets may be observed simultaneously or over the sensor lifetime. We propose a soft-sharing prior over the parameters of all classifiers and learn all tasks jointly. The soft-sharing prior enables any task to robustly borrow information from related tasks. The semi-supervised MTL combines the advantages of semi-supervised learning and multitask learning, thereby further improving the generalization performance of each classifier. Our MTL (or life-long learning) framework is based on our previous semi-supervised learning formulation, termed the neighborhood-based classifier (NeBC) [1]. The performance of the semi-supervised MTL is validated by experimental results on several sensing data sets.
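To make the soft-sharing idea concrete, the sketch below is a minimal illustration, not the paper's NeBC model: each task k keeps its own logistic-regression weights w_k, and a quadratic penalty pulls every w_k toward a shared mean, so a task with few labels can borrow strength from related tasks. All names and parameters (soft_share_mtl, lam, lr) are illustrative assumptions, not from the paper.

```python
import numpy as np

def soft_share_mtl(tasks, lam=1.0, lr=0.1, n_iters=500):
    """Jointly fit one logistic classifier per task with a soft-sharing penalty.

    tasks: list of (X, y) pairs, y in {0, 1}; only labeled rows are passed in.
    lam:   strength of the pull toward the shared mean weight vector.
    """
    d = tasks[0][0].shape[1]
    W = np.zeros((len(tasks), d))              # one weight vector per task
    for _ in range(n_iters):
        w_bar = W.mean(axis=0)                 # shared "mean" parameter across tasks
        for k, (X, y) in enumerate(tasks):
            p = 1.0 / (1.0 + np.exp(-X @ W[k]))      # predicted class probabilities
            grad_loss = X.T @ (p - y) / len(y)       # average logistic-loss gradient
            grad_prior = lam * (W[k] - w_bar)        # soft-sharing pull toward w_bar
            W[k] -= lr * (grad_loss + grad_prior)
    return W

# Toy usage: three related binary tasks drawn around a common weight vector.
rng = np.random.default_rng(0)
true_w = rng.normal(size=5)
tasks = []
for _ in range(3):
    X = rng.normal(size=(40, 5))
    y = (X @ (true_w + 0.1 * rng.normal(size=5)) > 0).astype(float)
    tasks.append((X, y))
print(soft_share_mtl(tasks).round(2))
```

The quadratic pull toward the shared mean plays the role of the soft-sharing prior described above: it regularizes each task's parameters toward the others without forcing them to be identical.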
