Incremental learning algorithms have been developed as an efficient solution for fast remodeling in Broad Learning Systems (BLS) without retraining. Although the structure and performance of broad learning are increasingly showing superiority, private data leakage in broad learning systems remains an open problem. Recently, the Multiparty Secure Broad Learning System (MSBLS) was proposed to allow two clients to participate in training; however, privacy-preserving broad learning across multiple clients has received limited attention. In this paper, we propose a Self-Balancing Incremental Broad Learning System (SIBLS) with privacy protection that accounts for the effect of different data sample sizes across clients, allowing multiple clients to be involved in incremental learning. Specifically, we design a client selection strategy that selects two clients in each round so as to reduce the gap in the number of data samples during the incremental updating process. To ensure security when multiple clients participate, we introduce a mediator in the data encryption and feature mapping process. Three classical datasets (MNIST, Fashion, and NORB) are used to validate the effectiveness of the proposed SIBLS. Experimental results show that SIBLS achieves performance comparable to MSBLS while outperforming federated learning in terms of accuracy and running time.
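The self-balancing client selection could be sketched as follows; this is a minimal illustrative sketch, not the paper's exact algorithm — the function name, the greedy pairing rule (matching the largest remaining client with the smallest), and the dictionary interface are all assumptions introduced here.

```python
def select_clients(sample_counts):
    """Pick two clients for one incremental round.

    Assumed heuristic (not from the paper): pair the client holding the
    most samples with the one holding the fewest, so that each round
    narrows the gap in data sample sizes across clients.
    """
    if len(sample_counts) < 2:
        raise ValueError("need at least two clients")
    # Rank client IDs by their sample counts, ascending.
    ranked = sorted(sample_counts, key=sample_counts.get)
    # Return (largest, smallest) as the pair for this round.
    return ranked[-1], ranked[0]
```

For example, with clients holding 100, 40, and 70 samples, this heuristic would pair the 100-sample client with the 40-sample client in the first round.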