Abstract

Typical online learning methods have achieved fruitful results within the framework of online convex optimization. Meanwhile, nonconvex loss functions have also received considerable attention for their noise resilience and sparsity. Existing nonconvex loss functions are typically designed to be smooth to ease the design of optimization algorithms; however, such losses no longer yield sparse support vectors. In this work, we focus on regression with a special type of nonconvex loss function (i.e., the canal loss) and propose a kernel-based online regression algorithm, noise-resilient online regression (NROR), to deal with noisy labels. The canal loss is a horizontally truncated loss and has the merit of sparsity. Although the canal loss is nonconvex and nonsmooth, the regularized canal loss has a convexity-like property called strong pseudo-convexity. Furthermore, a sublinear regret bound for NROR is proved under certain assumptions. Experimental studies show that NROR achieves low prediction error, in terms of mean absolute error and root mean squared error, on datasets with heavily noisy labels. In particular, we check whether the convergence assumptions strictly hold in practice and find that they are rarely violated and that the convergence rate is unaffected.
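The abstract does not give the exact definition of the canal loss, but a horizontally truncated regression loss can be sketched as follows. This is a minimal illustration, assuming an epsilon-insensitive loss capped at a ceiling `sigma`; the parameter names `eps` and `sigma` are illustrative, not taken from the paper.

```python
import numpy as np

def truncated_loss(y_true, y_pred, eps=0.1, sigma=1.0):
    """Hypothetical horizontally truncated ("canal"-style) regression loss.

    Residuals within the eps-tube incur zero loss (giving sparse support
    vectors); residuals beyond the cap contribute at most sigma, which
    bounds the influence of heavily noisy labels.
    """
    residual = np.abs(np.asarray(y_true) - np.asarray(y_pred))
    return np.minimum(np.maximum(residual - eps, 0.0), sigma)
```

Truncating the loss at a constant ceiling is what makes it nonconvex: an outlier's gradient vanishes once its residual exceeds the cap, so a single corrupted label cannot dominate the update.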
