Abstract
One-class classification is a generalization of supervised learning that learns from examples of a single class. It has attracted growing attention in machine learning and data mining. In this paper, we propose a novel approach called multi-task dictionary learning for one-class learning (MTD-OC), which incorporates analysis discriminative dictionary learning into one-class learning. The analysis discriminative dictionary learning ensures that the dictionaries corresponding to different tasks are as independent and discriminative as possible. It simultaneously minimizes an l2,1-norm constraint, an analysis incoherence term, and a sparse code extraction term, which together promote analysis incoherence and improve coding efficiency and classification accuracy. The one-class classifier for the target task is then constructed by transferring knowledge from multiple source tasks. Here, one-class classification improves the performance of the analysis discriminative dictionary, while the analysis discriminative dictionary in turn improves the performance of the one-class classification term. In MTD-OC, a single optimization function is formulated to jointly handle the one-class classifier and analysis discriminative dictionary learning from one class of examples. We then propose an iterative framework to solve this optimization function and obtain the predictive classifier for the target class. Extensive experiments show that MTD-OC improves the accuracy of the one-class classifier by learning an analysis discriminative dictionary from each task to construct a transfer classifier.
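The abstract names the ingredients of the objective (an l2,1-norm term, a cross-task analysis incoherence term, a sparse code extraction term, and a one-class classifier on the target task) but not the concrete formulation. The NumPy sketch below is therefore a hypothetical toy illustration of those ingredients, not the paper's algorithm: per-task analysis dictionaries are updated by gradient descent on a smoothed l2,1 penalty, a fit-to-sparse-codes term, and an incoherence penalty of the form ||Omega_s Omega_t^T||_F^2, and a simple hypersphere one-class classifier is then fit on the target task's analysis codes. All function names, the thresholding rule, and the optimization scheme are illustrative assumptions.

```python
import numpy as np

def smooth_l21(M, eps=1e-8):
    # Smoothed l2,1-norm (sum of row l2 norms), differentiable everywhere.
    return np.sum(np.sqrt(np.sum(M * M, axis=1) + eps))

def learn_dictionaries(X_tasks, k=6, lam=0.1, mu=0.1, lr=0.05, iters=300, seed=0):
    # Toy multi-task analysis dictionary learning (an assumption, not MTD-OC).
    # For each task t with data X_t (d x n), learn Omega_t (k x d) by descending
    #   ||Omega_t X_t - A_t||_F^2  (sparse code extraction, A_t = thresholded codes)
    # + lam * smoothed l2,1(Omega_t)
    # + mu  * sum_{s != t} ||Omega_s Omega_t^T||_F^2  (analysis incoherence).
    rng = np.random.default_rng(seed)
    T, d = len(X_tasks), X_tasks[0].shape[0]
    Omegas = [rng.standard_normal((k, d)) for _ in range(T)]
    for _ in range(iters):
        for t, X in enumerate(X_tasks):
            O = Omegas[t]
            code = O @ X
            A = code * (np.abs(code) > 0.1)          # hard-thresholded sparse codes
            g = 2.0 * (code - A) @ X.T               # gradient of the fit term
            rn = np.sqrt(np.sum(O * O, axis=1, keepdims=True) + 1e-8)
            g += lam * O / rn                        # gradient of smoothed l2,1
            for s in range(T):                       # incoherence gradients
                if s != t:
                    g += 2.0 * mu * O @ (Omegas[s].T @ Omegas[s])
            O = O - lr * g
            # keep rows unit-norm to rule out the trivial zero dictionary
            O /= np.linalg.norm(O, axis=1, keepdims=True) + 1e-12
            Omegas[t] = O
    return Omegas

def one_class_fit(Omega, X, slack=1.2):
    # Hypersphere one-class rule in analysis-code space (illustrative choice).
    codes = Omega @ X
    center = codes.mean(axis=1, keepdims=True)
    radius = slack * np.max(np.linalg.norm(codes - center, axis=0))
    return center, radius

def one_class_predict(Omega, center, radius, x):
    return np.linalg.norm(Omega @ x.reshape(-1, 1) - center) <= radius

# Tiny demo: two tasks of random data; fit the one-class rule on task 0.
rng = np.random.default_rng(1)
X_tasks = [rng.standard_normal((10, 40)) for _ in range(2)]
Omegas = learn_dictionaries(X_tasks)
center, radius = one_class_fit(Omegas[0], X_tasks[0])
accepted = [one_class_predict(Omegas[0], center, radius, X_tasks[0][:, i])
            for i in range(40)]
```

In an actual multi-task transfer setting, the dictionary updates and the one-class classifier would be solved jointly in one alternating scheme, as the abstract's iterative framework describes; here they are run sequentially only to keep the sketch short.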