Online learning aims to solve a sequence of consecutive prediction tasks by leveraging the knowledge gained from previous tasks. Linearized confidence-weighted (LCW) learning is the first online learning algorithm to introduce the concept of weight confidence into the prediction model through distributions over weights. It gives individual weights the flexibility to update at different scales. The kernel trick can be applied to LCW for better prediction performance. However, the kernel-based LCW algorithm suffers from the curse of kernelization: the prediction model can grow without bound in both runtime and memory consumption. In this study, we present the budgeted LCW (BLCW) algorithm, which limits this growth to a predefined budget via optimization. BLCW performs the LCW update and then reduces the resulting information loss by projection. Based on a resource perspective that reinterprets LCW in terms of resources and utilization degrees, we demonstrate that BLCW approximates the kernel-based LCW algorithm. We evaluate four budget maintenance strategies and find that mean removal is the most stable. Through numerical experiments on real datasets, we demonstrate that BLCW performs competitively and effectively compared to leading budgeted online algorithms.
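To make the budget-maintenance idea concrete, the following is a minimal, hypothetical Python sketch of a budgeted kernel online learner: it performs an additive update on a loss violation and, when the support set exceeds the budget, removes one support vector to cap model growth. The class name, the RBF kernel, the hinge-loss trigger, the step size, and the smallest-coefficient removal heuristic are all illustrative assumptions; this is not the paper's BLCW update or its projection step.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """RBF kernel between two vectors (an assumed kernel choice)."""
    d = x - z
    return np.exp(-gamma * d.dot(d))

class BudgetedKernelLearner:
    """Illustrative budgeted online learner, not the authors' BLCW."""

    def __init__(self, budget=50, eta=0.1, gamma=1.0):
        self.budget = budget  # maximum number of support vectors B
        self.eta = eta        # step size for the additive update
        self.gamma = gamma
        self.sv = []          # stored support vectors
        self.alpha = []       # coefficients of the kernel expansion

    def score(self, x):
        """Kernel expansion of the current model at x."""
        return sum(a * rbf(s, x, self.gamma)
                   for a, s in zip(self.alpha, self.sv))

    def update(self, x, y):
        """One online round: predict, update on loss, enforce the budget."""
        if y * self.score(x) < 1.0:          # hinge-loss violation
            self.sv.append(x)                # model grows by one SV
            self.alpha.append(self.eta * y)
            if len(self.sv) > self.budget:
                # Budget maintenance (assumed heuristic): drop the SV with
                # the smallest |coefficient|, i.e. the one whose removal
                # perturbs the model the least.
                j = int(np.argmin(np.abs(self.alpha)))
                del self.sv[j], self.alpha[j]

# Tiny usage example on synthetic, nonlinearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])               # XOR-like labels
model = BudgetedKernelLearner(budget=30)
for xi, yi in zip(X, y):
    model.update(xi, yi)
print("support vectors kept:", len(model.sv))  # never exceeds the budget
```

The key design point this sketch illustrates is that the model size is bounded by the budget regardless of how many rounds are processed; BLCW refines this idea by choosing what to discard so as to minimize the information lost from the weight distribution.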