Abstract

Ensembles of deep convolutional neural networks (CNNs), which combine multiple deep CNN models to achieve better generalization, now play an important role in ensemble learning because of the dominant position of deep learning. However, the use of ensembles of deep CNNs remains limited, because the increasing complexity of deep CNN architectures and the emergence of high-dimensional data make both the training and testing stages of such ensembles inevitably expensive. To alleviate this situation, we propose a new approach that finds multiple models converging to local minima in subparameter space for ensembles of deep CNNs. The subparameter space here refers to the space constructed from a partial selection of the parameters of a deep CNN architecture, rather than the entire parameter set. We show that local minima found in the subparameter space of a deep CNN architecture can in fact be effective for ensembles of deep CNNs to achieve better generalization. Moreover, finding local minima in the subparameter space is more affordable at the training stage, and the multiple models at the found local minima can be selectively fused to achieve better ensemble generalization while limiting the testing-stage cost to that of a single deep CNN model. Demonstrations of MobileNetV2, ResNet50 and InceptionV4 (deep CNN architectures ranging from lightweight to complex) on ImageNet, CIFAR-10 and CIFAR-100, respectively, lead us to believe that finding local minima in the subparameter space of a deep CNN architecture could be leveraged to broaden the use of ensembles of deep CNNs.
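To make the idea of a subparameter space concrete, the following is a minimal, hypothetical sketch (not the authors' exact procedure): only a chosen subset of a CNN's parameters is optimized to reach several local minima, and the resulting snapshots are fused by averaging that subset so inference still costs a single forward pass. All names here (e.g., `subset_keyword`, the averaging-based fusion) are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch of subparameter-space training and fusion (PyTorch).
import copy
import torch
import torchvision

def split_params(model, subset_keyword="layer4"):
    """Partition parameters into a trainable subset (the 'subparameter space')
    and a frozen remainder, selected here by a simple name match."""
    subset, frozen = [], []
    for name, p in model.named_parameters():
        (subset if subset_keyword in name else frozen).append(p)
    return subset, frozen

def find_local_minima(model, loader, criterion, n_minima=3, epochs=2, lr=1e-3):
    """Run several short training phases over the parameter subset and keep
    one snapshot of that subset at the end of each phase."""
    subset, frozen = split_params(model)
    for p in frozen:
        p.requires_grad_(False)          # only the subset is optimized
    snapshots = []
    for _ in range(n_minima):
        opt = torch.optim.SGD(subset, lr=lr, momentum=0.9)
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                loss = criterion(model(x), y)
                loss.backward()
                opt.step()
        snapshots.append([p.detach().clone() for p in subset])
    return snapshots

def fuse_snapshots(model, snapshots):
    """Fuse the snapshots by averaging the subset weights, so testing still
    uses a single network."""
    subset, _ = split_params(model)
    with torch.no_grad():
        for i, p in enumerate(subset):
            p.copy_(torch.stack([s[i] for s in snapshots]).mean(dim=0))
    return model

# Example usage with a standard backbone (data loaders omitted):
# model = torchvision.models.resnet50(num_classes=10)
# snaps = find_local_minima(model, train_loader, torch.nn.CrossEntropyLoss())
# model = fuse_snapshots(model, snaps)
```

Because the remaining parameters stay frozen, each training phase explores a much smaller space, which is the intuition behind the reduced training cost; the fusion step is one way, under these assumptions, to keep testing-stage expense at a single model.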
