Abstract

Owing to the efficient model calibration afforded by its unique incremental learning capability, the broad learning system (BLS) has made impressive progress in image analysis tasks such as image classification and object detection. Inspired by this incremental remodeling success, we propose a novel transformer-BLS network that achieves a trade-off between model training speed and accuracy. Specifically, we develop sub-BLS layers equipped with the multi-head attention mechanism and combine these layers to construct the transformer-BLS network. In particular, the proposed network provides four incremental learning algorithms that allow it to increment its feature nodes, enhancement nodes, input data and sub-BLS layers, respectively, without requiring a full update of the model weights. Furthermore, we validate the performance of the transformer-BLS network and its four incremental learning algorithms on a variety of image classification datasets. The results demonstrate that the proposed transformer-BLS maintains classification performance on both the MNIST and Fashion-MNIST datasets while saving two-thirds of the training time. These findings suggest that the proposed method can significantly reduce model training complexity through incremental remodeling while simultaneously improving the incremental learning performance of the original BLS, particularly on classification tasks for certain datasets.
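For illustration only, the following minimal NumPy sketch shows one way a sub-BLS layer of the kind described above could combine BLS-style feature and enhancement nodes with a multi-head attention step. The function names (sub_bls_layer, multi_head_attention, fit_output_weights), the grouping of feature nodes into attention tokens, and all hyperparameters are assumptions, since the abstract does not specify the exact construction.

import numpy as np

rng = np.random.default_rng(0)

def multi_head_attention(tokens, num_heads=2):
    # tokens: (batch, n_tokens, d_model). Simplified multi-head self-attention
    # with fixed random projections, purely for illustration.
    b, t, d = tokens.shape
    d_head = d // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.normal(size=(d, d_head)) / np.sqrt(d)
        Wk = rng.normal(size=(d, d_head)) / np.sqrt(d)
        Wv = rng.normal(size=(d, d_head)) / np.sqrt(d)
        Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
        scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
        attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
        attn /= attn.sum(axis=-1, keepdims=True)
        heads.append(attn @ V)
    return np.concatenate(heads, axis=-1)

def sub_bls_layer(X, n_groups=5, d_group=8, n_enhance=60):
    # Feature nodes: random linear mappings of the input (the sparse
    # autoencoder refinement used in standard BLS is omitted here).
    Wf = rng.normal(size=(X.shape[1], n_groups * d_group))
    Z = np.tanh(X @ Wf)
    # Hypothetical step: treat each group of feature nodes as a token
    # and apply multi-head self-attention within each sample.
    Z_att = multi_head_attention(Z.reshape(len(X), n_groups, d_group))
    Z_att = Z_att.reshape(len(X), n_groups * d_group)
    # Enhancement nodes: nonlinear expansion of the attended feature nodes.
    Wh = rng.normal(size=(Z_att.shape[1], n_enhance))
    H = np.tanh(Z_att @ Wh)
    return np.hstack([Z_att, H])

def fit_output_weights(A, Y, lam=1e-3):
    # Closed-form ridge regression, as in standard BLS output training.
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

# Toy usage on random data standing in for flattened image features.
X = rng.normal(size=(200, 32))
Y = np.eye(10)[rng.integers(0, 10, size=200)]   # one-hot class labels
A = sub_bls_layer(X)
W = fit_output_weights(A, Y)
print("state matrix:", A.shape, "output weights:", W.shape)

In this sketch the only trained parameters are the output weights obtained in closed form, mirroring the standard BLS training scheme the abstract builds on; the paper's actual incremental updates of feature nodes, enhancement nodes, input data and sub-BLS layers are not reproduced here.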
