Abstract

Transfer learning addresses problems in which samples are drawn from more than one domain, with the goal of transferring knowledge from source tasks to target tasks. A variety of methodologies have been proposed for transfer learning: some concentrate on the inner relationships within each domain, while others pay more attention to knowledge transfer across domains. In this paper, a new dictionary learning with multi-task transfer learning method (DMTTL) is proposed, based on the hinge loss and SVM. Dictionary learning is used to learn sparse representations of the given samples, and a regularization term coupling the two dictionaries is exploited so that the similarity of samples across domains can be well captured. In addition, a new optimization method based on alternate convex search is proposed, together with a convergence analysis, indicating that DMTTL is a well-founded approach. Comparison with state-of-the-art approaches demonstrates the feasibility and competitive performance of DMTTL on multi-task classification problems, and statistical results show that the proposed method outperforms previous methods.
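
The abstract does not state the objective explicitly; as a rough illustration only, a coupled dictionary-learning objective of the kind described above is often written in the following generic form, where $D_s, D_t$ denote the source and target dictionaries, $X_s, X_t$ the sparse codes for samples $Y_s, Y_t$ with labels $y_{d,i}$, $w$ an SVM-style classifier, and $\lambda, \eta, \gamma$ illustrative trade-off weights. All of this notation is an assumption for exposition, not the paper's exact formulation:

\[
\min_{D_s,\, D_t,\, X_s,\, X_t,\, w} \;\; \sum_{d \in \{s,t\}} \Big( \|Y_d - D_d X_d\|_F^2 \;+\; \lambda \|X_d\|_1 \;+\; \eta \sum_{i} \max\big(0,\; 1 - y_{d,i}\, w^{\top} x_{d,i}\big) \Big) \;+\; \gamma \|D_s - D_t\|_F^2
\]

Under this reading, the last term is the dictionary-similarity regularizer mentioned in the abstract, and alternate convex search would cycle between updating the codes $X_d$ (a convex problem for fixed $D_d$ and $w$) and updating $D_d$ and $w$ (convex for fixed codes), which is consistent with the convergence analysis the abstract refers to.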
