Abstract

In this article, we formulate lifelong learning as an online transfer learning procedure over consecutive tasks, where learning a given task depends on the accumulated knowledge. We propose a novel, theoretically principled framework, lifelong online learning, in which each task is learned incrementally. Specifically, our framework combines two levels of prediction: a prediction derived solely from the current task, and a prediction drawn from the knowledge base built over previous tasks. Moreover, this article tackles several fundamental challenges: an arbitrary or even non-stationary task-generation process, an unknown number of instances in each task, and the construction of an efficient accumulated knowledge base. Notably, we provide a provable bound for the proposed algorithm, which offers insight into how the accumulated knowledge improves predictions. Finally, empirical evaluations on both synthetic and real datasets validate the effectiveness of the proposed algorithm.
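
To make the two-level prediction concrete, below is a minimal Python sketch under illustrative assumptions: linear predictors with squared loss, online gradient descent at the task level, a knowledge base that averages past task weights, and a Hedge-style multiplicative update for mixing the two levels. The class name LifelongOnlineLearner and every update rule here are hypothetical stand-ins, not the paper's actual algorithm or the one covered by its bound.

```python
import numpy as np

class LifelongOnlineLearner:
    """Illustrative two-level lifelong online learner (not the paper's algorithm).

    Level 1: a task-specific linear predictor, updated by online
    gradient descent on the current task's stream.
    Level 2: a knowledge-base predictor built from previous tasks
    (here, simply the running mean of past task weight vectors).
    The two predictions are mixed with weights updated
    multiplicatively from each level's observed loss.
    """

    def __init__(self, dim, lr=0.1, eta=0.5):
        self.dim = dim
        self.lr = lr             # step size for the task-level learner
        self.eta = eta           # learning rate for the mixing weights
        self.kb = np.zeros(dim)  # knowledge base: mean of past task weights
        self.n_tasks = 0

    def start_task(self):
        """Reset the task-level predictor and the level-mixing weights."""
        self.w = np.zeros(self.dim)
        self.mix = np.array([0.5, 0.5])  # [task-level, knowledge-base]

    def predict(self, x):
        """Two-level prediction: convex combination of both levels."""
        p_task = self.w @ x
        p_kb = self.kb @ x
        return self.mix[0] * p_task + self.mix[1] * p_kb

    def update(self, x, y):
        """One online round: observe (x, y), update both levels."""
        p_task, p_kb = self.w @ x, self.kb @ x
        losses = np.array([(p_task - y) ** 2, (p_kb - y) ** 2])
        self.mix *= np.exp(-self.eta * losses)  # reweight the two levels
        self.mix /= self.mix.sum()              # keep the mix normalized
        grad = 2.0 * (p_task - y) * x           # squared-loss gradient
        self.w -= self.lr * grad                # task-level OGD step

    def end_task(self):
        """Fold the finished task's predictor into the knowledge base."""
        self.n_tasks += 1
        self.kb += (self.w - self.kb) / self.n_tasks

# Usage sketch: tasks arrive consecutively, each with an unknown-length stream.
learner = LifelongOnlineLearner(dim=5)
rng = np.random.default_rng(0)
for task in range(3):
    learner.start_task()
    w_true = rng.normal(size=5)      # synthetic ground truth for this task
    for _ in range(100):
        x = rng.normal(size=5)
        y_hat = learner.predict(x)   # predict before the label arrives
        learner.update(x, w_true @ x)
    learner.end_task()
```

The intent of the sketch is that, as tasks accumulate, the knowledge-base prediction becomes useful early in each new task (before the task-level learner has seen enough data), and the multiplicative mixing shifts weight toward whichever level is currently more accurate.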
