Abstract

Most existing multilabel classification methods are batch learners, which can incur expensive retraining costs when new data arrive. To overcome this drawback of batch learning, we develop a family of online multilabel classification algorithms that update the model instantly and efficiently and make timely online predictions as new data arrive. Each algorithm admits a closed-form update, obtained by solving a constrained optimization problem in every round of online learning, and label correlation is explicitly modeled in this optimization problem. The label thresholding function, an important component of our online classifier, is also learned online. Our algorithms readily generalize to nonlinear prediction using Mercer kernels. We provide worst-case loss bounds for our algorithms, relative to the cumulative loss suffered by the best fixed predictive model attainable in hindsight. Finally, we corroborate the merits of our algorithms in both linear and nonlinear prediction on nine open multilabel benchmark datasets.
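
As a rough illustration of the round-by-round, closed-form update style the abstract describes, the following Python sketch implements a passive-aggressive-style online multilabel learner with a jointly learned threshold. The class name, update rule, and aggressiveness parameter C are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical sketch of an online multilabel learner in the closed-form,
# round-by-round style described by the abstract. The specific update rule
# (PA-I style), the shared threshold b, and the parameter C are assumptions
# for illustration, not the paper's algorithm.
import numpy as np

class OnlineMultilabelPA:
    def __init__(self, n_features, n_labels, C=1.0):
        self.W = np.zeros((n_labels, n_features))   # one weight vector per label
        self.b = 0.0                                 # shared threshold, learned online
        self.C = C                                   # aggressiveness parameter

    def predict(self, x):
        scores = self.W @ x
        return (scores > self.b).astype(int)         # labels whose score exceeds the threshold

    def update(self, x, y):
        """One online round: x is a feature vector, y a {0,1} label vector."""
        scores = self.W @ x
        signs = 2 * y - 1                            # map {0,1} labels to {-1,+1}
        losses = np.maximum(0.0, 1.0 - signs * (scores - self.b))  # per-label hinge loss
        sq_norm = x @ x + 1.0                        # +1 accounts for the threshold term
        taus = np.minimum(self.C, losses / sq_norm)  # closed-form step sizes per label
        self.W += (taus * signs)[:, None] * x[None, :]
        self.b -= np.mean(taus * signs)              # move the threshold jointly with the weights
        return losses.sum()
```

A kernelized variant in the spirit of the abstract's Mercer-kernel extension would replace the inner products (W @ x and x @ x) with kernel evaluations against stored support instances.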
