Abstract

Transfer learning methods have demonstrated state-of-the-art performance on various small-scale image classification tasks, generally by exploiting the information in a convolutional neural network pre-trained on ImageNet (ImageNet CNN). However, the transferred CNN model typically has high computational complexity and storage requirements, which raises issues for real-world applications, especially on portable devices such as phones and tablets without high-performance GPUs. Several approximation methods have been proposed to reduce this complexity by reconstructing the linear or non-linear filters (responses) in convolutional layers with a series of smaller ones. In this paper, we present a compact CNN transfer learning method for small-scale image classification. Specifically, the method decomposes into a fine-tuning stage and a joint learning stage. In the fine-tuning stage, a high-performance target CNN is trained by transferring information from the ImageNet CNN. In the joint learning stage, a compact target CNN is optimized on the ground-truth labels jointly with the predictions of the high-performance target CNN. Experimental results on CIFAR-10 and MIT Indoor Scene demonstrate the effectiveness and efficiency of the proposed method.
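The joint learning stage described above resembles knowledge distillation: the compact network is trained on a weighted combination of the cross-entropy with ground-truth labels and a term matching the high-performance network's predictions. A minimal NumPy sketch of such a combined objective is below; the temperature `T`, weight `lam`, and function names are illustrative assumptions, not values or APIs from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution.
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def joint_loss(student_logits, teacher_logits, labels, T=2.0, lam=0.5):
    # Hard-label branch: standard cross-entropy with ground-truth labels.
    n = student_logits.shape[0]
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    # Soft-target branch: cross-entropy against the softened
    # predictions of the high-performance (teacher) network.
    q = softmax(teacher_logits, T)
    log_p_soft = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(q * log_p_soft).sum(axis=-1).mean()
    # Weighted combination of the two objectives.
    return (1 - lam) * ce + lam * soft
```

In practice the gradient of this loss would drive the compact network toward both the labels and the teacher's output distribution; the soft targets carry inter-class similarity information that hard labels alone do not.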
