Abstract

The efficiency and scalability of online learning methods make them a popular choice for learning problems with big data and limited memory. Most existing online learning approaches are based on global models, which assume that incoming examples are linearly separable. However, this assumption does not always hold in practice. The local online learning framework was therefore proposed to solve non-linearly separable tasks without kernel modeling. However, the weights in the local online learning framework are updated using only first-order information, which significantly limits its performance. Intuitively, second-order online learning algorithms, e.g., Soft Confidence-Weighted (SCW), can significantly alleviate this issue. Inspired by second-order algorithms and the local online learning framework, we propose a Soft Confidence-Weighted Local Online Learning (SCW-LOL) algorithm, which extends the single-hyperplane SCW to the case of multiple local hyperplanes. These local hyperplanes are connected by a common component and optimized simultaneously. We also examine the theoretical relationship between the single- and multiple-hyperplane settings. Extensive experimental results show that the proposed SCW-LOL learns a decision boundary that converges online, achieving the best overall performance on almost all datasets, without any kernel modeling or parameter tuning.
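To make the second-order update concrete, below is a minimal sketch of the standard single-hyperplane SCW-I step (Wang, Zhao, and Hoi, 2012) that SCW-LOL builds on: the learner maintains a Gaussian over the weight vector (mean `mu`, covariance `Sigma`) and, on a margin violation, moves the mean toward the correct side and shrinks the covariance along the observed example. This is an illustrative sketch, not the paper's method: the class name `SCW`, the hyperparameters `C` and `eta`, and the toy usage are assumptions, and SCW-LOL's multi-hyperplane coupling through a common component is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

class SCW:
    """Soft Confidence-Weighted (SCW-I) binary classifier.

    Maintains a Gaussian over the weight vector: mean mu, covariance Sigma.
    """

    def __init__(self, dim, C=1.0, eta=0.9):
        self.mu = np.zeros(dim)
        self.Sigma = np.eye(dim)
        self.C = C
        self.phi = norm.ppf(eta)          # confidence parameter phi = Phi^{-1}(eta)
        self.psi = 1.0 + self.phi**2 / 2.0
        self.zeta = 1.0 + self.phi**2

    def update(self, x, y):
        """One SCW-I step for example x with label y in {-1, +1}."""
        m = y * (self.mu @ x)             # signed margin
        v = x @ self.Sigma @ x            # variance (confidence) along x
        loss = max(0.0, self.phi * np.sqrt(v) - m)
        if loss <= 0:
            return                        # confident correct prediction: no update
        # Closed-form step size, clipped at C (soft constraint of SCW-I).
        alpha = min(self.C, max(0.0,
            (-m * self.psi + np.sqrt(m**2 * self.phi**4 / 4.0
                                     + v * self.phi**2 * self.zeta))
            / (v * self.zeta)))
        u = 0.25 * (-alpha * v * self.phi
                    + np.sqrt(alpha**2 * v**2 * self.phi**2 + 4.0 * v))**2
        beta = alpha * self.phi / (np.sqrt(u) + v * alpha * self.phi)
        Sx = self.Sigma @ x
        self.mu += alpha * y * Sx                 # move mean toward correct side
        self.Sigma -= beta * np.outer(Sx, Sx)     # shrink variance along x

# Toy usage: a stream of 2-D examples with labels in {-1, +1}.
rng = np.random.default_rng(0)
clf = SCW(dim=2)
for _ in range(1000):
    x = rng.normal(size=2)
    y = 1.0 if x[0] + x[1] > 0 else -1.0
    clf.update(x, y)
print(np.sign(clf.mu @ np.array([1.0, 1.0])))  # expect 1.0
```

A local extension in the spirit of the abstract would maintain several such Gaussian models (one per local hyperplane), route each incoming example to one of them, and apply the same second-order update; the specific routing rule and the shared common component are defined in the paper itself and are not assumed here.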
