Abstract

This paper addresses an incremental learning problem in which tasks are learned sequentially without access to the data of previously learned tasks. Catastrophic forgetting is a major bottleneck in incremental learning because a network performs poorly on previous tasks after it is trained on a new task. We propose an adaptive model search method that uses a different part of the backbone network depending on the input image to mitigate catastrophic forgetting. Our model search method prevents forgetting by minimizing updates to parameters that are critical for the previous tasks while learning a new task. The search involves a trainable network that selects the model structure for each input image. We also propose a method for approximating the loss function of the previous tasks without their datasets. The parameters critical for the previous tasks can then be identified from the relationship between the approximated loss function and the parameters. The proposed framework is the first model search method that considers the performance of both current and previous tasks in the incremental learning setting. Empirical studies and theoretical analysis show that the proposed method outperforms competing methods on both old and new tasks while requiring less computation.
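The full text is not available here, but the two core ideas stated in the abstract, per-input selection of backbone components and a data-free surrogate for the previous-task loss, can be illustrated with a rough sketch. The class SelectiveLayer, the function importance_penalty, the Gumbel-softmax selector, and the EWC-style quadratic penalty below are illustrative assumptions, not the authors' actual design.

# Illustrative sketch only; not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveLayer(nn.Module):
    """Holds several candidate blocks; a small trainable selector picks
    one block per input (relaxed via Gumbel-softmax during training)."""
    def __init__(self, in_dim, out_dim, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_blocks)])
        self.selector = nn.Linear(in_dim, num_blocks)  # trainable search network

    def forward(self, x, tau=1.0):
        logits = self.selector(x)                              # per-input block scores
        gate = F.gumbel_softmax(logits, tau=tau, hard=True)    # (B, K) one-hot selection
        outs = torch.stack([blk(x) for blk in self.blocks], 1) # (B, K, out_dim)
        return (gate.unsqueeze(-1) * outs).sum(dim=1)          # route each input

def importance_penalty(model, old_params, importance):
    """Quadratic surrogate for the previous-task loss (an assumed, EWC-style
    form): penalize changes to parameters deemed critical for earlier tasks."""
    loss = 0.0
    for name, p in model.named_parameters():
        if name in old_params:
            loss = loss + (importance[name] * (p - old_params[name]) ** 2).sum()
    return loss

In a sketch like this, the selector would be trained jointly with the new-task loss plus the importance penalty, so routing can favor blocks whose parameters matter less for earlier tasks; the paper's actual selection network and loss approximation may differ.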
