Abstract

Real-world data typically arrives in stream form. This poses two challenges for building incremental deep models: a) capacity scalability and b) capacity sustainability. To address them, we develop an incremental deep model (IDM). However, IDM ignores a third significant challenge with streaming data: c) capacity demand. Training a deep model requires a large amount of labeled data, yet it is almost impossible to label every incoming instance in real time. We therefore extend IDM to a cost-effective incremental deep model (CE-IDM), which adaptively selects the most discriminative newly arriving instances for query, reducing manual labeling costs. Specifically, CE-IDM adopts a novel extensible deep network structure that attaches an extra attention model to the hidden layers. Based on the adaptive attention weights, CE-IDM derives a novel instance selection criterion that jointly estimates the representativeness and informativeness of unlabeled instances to satisfy the capacity demand. With the newly labeled instances, CE-IDM quickly updates the model with adaptive depth from the stream, enabling capacity scalability. We also address capacity sustainability by exploiting an attention-based Fisher information matrix, which slows down forgetting. We conduct extensive experiments on real-world data to validate CE-IDM.
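The abstract names two concrete mechanisms: a selection criterion that jointly scores unlabeled instances by representativeness and informativeness, and a Fisher-information penalty that slows forgetting. The sketch below illustrates both ideas in a minimal, generic form; the specific scoring functions (`entropy` for informativeness, mean cosine similarity for representativeness), the mixing weight `alpha`, and the diagonal-Fisher penalty are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def entropy(probs):
    # Informativeness proxy: entropy of the model's predicted class probabilities.
    p = np.clip(probs, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

def selection_scores(probs, features, alpha=0.5):
    """Hypothetical joint score: alpha * informativeness + (1 - alpha) * representativeness."""
    info = entropy(probs)
    # Representativeness proxy: mean cosine similarity to the rest of the pool.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    rep = (f @ f.T).mean(axis=1)

    def minmax(x):
        # Normalize each term so the two scores are comparable.
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    return alpha * minmax(info) + (1 - alpha) * minmax(rep)

def select_for_query(probs, features, budget, alpha=0.5):
    # Query the `budget` highest-scoring unlabeled instances.
    scores = selection_scores(probs, features, alpha)
    return np.argsort(scores)[::-1][:budget]

def fisher_penalty(params, old_params, fisher_diag, lam=1.0):
    # EWC-style quadratic penalty weighted by (diagonal) Fisher information,
    # discouraging changes to parameters important for earlier data.
    return lam / 2 * float(((params - old_params) ** 2 * fisher_diag).sum())
```

For example, an instance with near-uniform predicted probabilities (high entropy) will outrank confidently classified ones when representativeness ties, and `fisher_penalty` is zero when the parameters have not moved.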

