Abstract

This paper develops a new method for lifelong learning, the Sparsify Dynamically Expandable Network (SDEN) via Variational Dropout, which learns a sparse model while preserving performance. The Dynamically Expandable Network (DEN) can learn a sequence of tasks in an online manner by retraining the network, expanding it with only the necessary neurons, and splitting neurons to effectively prevent semantic drift. To overcome DEN's point estimation of parameters, Bayesian Compression for DEN has been developed under the Bayesian framework; however, this approach demands more time for model training and testing. To improve efficiency, we propose SDEN under an efficient sparse learning framework. We validate SDEN in lifelong learning scenarios on several widely used benchmarks, where it achieves comparable classification accuracy with less training and testing time than the competing methods. Moreover, SDEN learns a sparser network structure, i.e., one with fewer parameters.
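The sparsification idea the abstract refers to can be illustrated with a minimal sketch of variational-dropout-style pruning: each weight carries a learned noise variance, and weights whose implied dropout rate (log-alpha) exceeds a threshold are zeroed out. The function name, the `threshold=3.0` default, and the toy values below are illustrative assumptions, not taken from the paper.

```python
import math

def sparsify_by_log_alpha(weights, log_sigma2, threshold=3.0):
    """Prune weights in the style of sparse variational dropout.

    log_alpha = log(sigma^2) - log(w^2); a large value means the weight's
    multiplicative noise dominates its signal, so the weight is pruned.
    The names and threshold here are illustrative, not from the paper.
    """
    pruned, mask = [], []
    for w, ls2 in zip(weights, log_sigma2):
        log_alpha = ls2 - math.log(w * w + 1e-8)  # small constant avoids log(0)
        keep = log_alpha < threshold              # keep low-dropout-rate weights
        mask.append(keep)
        pruned.append(w if keep else 0.0)
    return pruned, mask

# Toy example: tiny weights with the same noise variance get pruned.
w = [0.9, 0.001, -1.2, 0.0005]
log_s2 = [-2.0, -2.0, -2.0, -2.0]
sparse_w, mask = sparsify_by_log_alpha(w, log_s2)
sparsity = 1.0 - sum(mask) / len(mask)
```

In a full model the `log_sigma2` values would be learned jointly with the weights under a variational objective; this sketch only shows the thresholding step that yields the sparse structure.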
