Abstract

Transfer learning has achieved great success in many fields, and many excellent algorithms have been proposed. In recent years, many researchers have turned to a newer research area called online transfer learning, which differs from general transfer learning: it concentrates on how to build a good classifier on the target domain when the training data arrive in an online/sequential manner. This paper focuses on the online transfer learning problem with a single source domain in a homogeneous feature space. The existing algorithms HomOTL-I and HomOTL-II simply ensemble the source-domain and target-domain classifiers directly; when the distribution difference between the source domain and the target domain is large, this does not yield a good transfer effect. We are inspired by the idea behind boosting, namely that a strong classifier can be formed by combining multiple weak ones. We train multiple classifiers on the source domain in an offline manner using the AdaBoost algorithm, then combine each of these source-domain classifiers with a classifier trained in an online manner on the target domain, forming multiple weak combinations in an ensemble. Based on these ideas, we propose two algorithms, AB-HomOTL-I and AB-HomOTL-II, which adjust the ensemble weights in different ways. We evaluated our algorithms on a sentiment analysis dataset and the 20 Newsgroups dataset; the results show that they outperform the baseline algorithms.
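The overall scheme described above can be sketched in a few dozen lines of numpy. This is a minimal illustration under stated assumptions, not the paper's exact AB-HomOTL updates: decision stumps serve as the AdaBoost weak learners, a perceptron is the online target-domain classifier, and the multiplicative discount factor `beta` is a generic Hedge-style weight update chosen for illustration.

```python
import numpy as np

def train_adaboost_stumps(X, y, n_rounds=3):
    """Offline AdaBoost on the source domain with decision-stump weak
    learners. Labels y are in {-1, +1}. Returns a list of stumps
    (feature, threshold, polarity, alpha)."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # example weights
    stumps = []
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        for j in range(d):                  # exhaustive stump search
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = max(best_err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        j, thr, pol = best
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)      # reweight misclassified examples
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))
    return stumps

def stump_predict(stump, x):
    j, thr, pol, _ = stump
    return pol * (1 if x[j] >= thr else -1)

def ab_homotl_sketch(stumps, X_t, y_t, beta=0.8, lr=0.1):
    """Online phase (illustrative): the offline source stumps and one
    online perceptron on the target each carry an ensemble weight that
    is discounted multiplicatively whenever that component errs."""
    d = X_t.shape[1]
    w_p = np.zeros(d)                       # online perceptron (target)
    u = np.ones(len(stumps) + 1)            # ensemble weights
    mistakes = 0
    for x, y in zip(X_t, y_t):
        votes = [stump_predict(s, x) for s in stumps]
        votes.append(1 if w_p @ x >= 0 else -1)
        yhat = 1 if np.dot(u, votes) >= 0 else -1
        mistakes += int(yhat != y)
        for k, v in enumerate(votes):       # discount wrong components
            if v != y:
                u[k] *= beta
        u /= u.sum()
        if votes[-1] != y:                  # perceptron update on target
            w_p += lr * y * x
    return mistakes, u

# Toy demo: source and target share the concept y = sign(x0 - 0.5).
rng = np.random.default_rng(0)
X_s = rng.uniform(size=(40, 2))
y_s = np.where(X_s[:, 0] > 0.5, 1, -1)
X_t = rng.uniform(size=(60, 2))
y_t = np.where(X_t[:, 0] > 0.5, 1, -1)
stumps = train_adaboost_stumps(X_s, y_s)
m, u = ab_homotl_sketch(stumps, X_t, y_t)
```

In this sketch the two proposed variants would differ only in how `u` is adjusted; the Hedge-style discount shown here stands in for those weight-update rules.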
