Abstract

In machine learning, deep neural networks (DNNs) have become mainstream because they can learn higher-level features and thus form deep representations. However, DNNs require large amounts of memory and long training times, so improving the efficiency and effectiveness of DNN training has become an increasingly important research focus in recent years. In this paper, we propose a training method, Ensemble2Net, that accelerates the training of deep convolutional neural networks (DCNNs) and helps student networks learn knowledge from DCNN-based teacher networks. We apply Ensemble2Net to accelerate transfer learning in VGGNet (13/16/19) and ResNet. The results show that Ensemble2Net helps VGGNet and ResNet reach their best accuracy at lower cost than current approaches. In particular, ResNet trained with Ensemble2Net for 20 epochs achieves better accuracy than the original ResNet trained for more than 170 epochs, a 1.503x speedup.
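
The abstract does not spell out the Ensemble2Net procedure, but the teacher-to-student knowledge transfer it refers to is commonly realized with a distillation-style loss. The sketch below is a minimal, generic illustration of that idea, not the Ensemble2Net algorithm itself; the temperature `T` and mixing weight `alpha` are assumed hyperparameters introduced only for this example.

```python
# Generic teacher-student knowledge-transfer loss (a sketch, not Ensemble2Net).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend a soft-target KL term (guidance from the teacher's softened
    predictions) with the usual hard-label cross-entropy term."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term's magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```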
