Abstract

In this paper, we investigate sparse online linear classification in changing environments. We first analyze the tracking performance of standard online linear classifiers, which use gradient descent to minimize the regularized hinge loss. The derived shifting bounds highlight the importance of choosing appropriate step sizes in the presence of concept drift. Notably, we show that constant step sizes adapt to concept drift better than the state-of-the-art decreasing step sizes. Based on these observations, we propose a novel sparse online linear classifier with a constant step size, called Sparse Approximated Linear Classification (SALC). In essence, SALC rounds small weights to zero to achieve sparsity, and controls the resulting truncation error in a principled way to achieve a low tracking regret. The degree of sparsity is continuous and is governed by a parameter that captures the tradeoff between the sparsity of the model and the regret of the algorithm. Experiments on nine stationary data sets show that SALC outperforms state-of-the-art sparse online learning algorithms, especially when the solution is required to be sparse; on seven groups of nonstationary data sets with various total shifting amounts, SALC also tracks drifts well. When wrapped with a drift detector, SALC achieves remarkable tracking performance regardless of the total shifting amount.
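The abstract describes the core SALC update as a constant-step-size gradient step on the regularized hinge loss, followed by rounding small weights to zero. A minimal sketch of such an update is shown below; the function name, parameter names, and default values (`eta`, `lam`, `theta`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def salc_step(w, x, y, eta=0.1, lam=0.01, theta=0.05):
    """One illustrative SALC-style online update (a sketch, not the paper's
    exact algorithm).

    w     : current weight vector
    x, y  : incoming example and its label in {-1, +1}
    eta   : constant step size (constants are argued to track drift better)
    lam   : strength of the L2 regularizer in the regularized hinge loss
    theta : truncation threshold; weights with |w_i| < theta are set to 0
    """
    # Subgradient of max(0, 1 - y<w, x>) + (lam/2)||w||^2 at the current w
    if y * np.dot(w, x) < 1.0:
        grad = lam * w - y * x
    else:
        grad = lam * w
    w = w - eta * grad          # constant-step-size gradient descent step
    w = w.copy()
    w[np.abs(w) < theta] = 0.0  # round small weights to zero for sparsity
    return w
```

Raising `theta` yields sparser weight vectors at the cost of a larger truncation error, mirroring the sparsity/regret tradeoff parameter the abstract mentions.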
