Abstract

Recently, alpha matting has achieved remarkable progress driven by wide and deep convolutional neural networks. However, previous deep learning-based alpha matting methods incur a high computational cost, which limits their use in real environments, including mobile devices. In this letter, a lightweight natural image matting network with similarity-preserving knowledge distillation is developed. The similarity-preserving knowledge distillation encourages the pairwise similarities produced by a compact student network to match those of a teacher network. The pairwise similarity, measured over spatial, channel, and batch units, enables the transfer of the teacher's knowledge to the student. Based on the similarity-preserving knowledge distillation, we not only design a student network that is lighter and smaller than the teacher but also achieve performance superior to that of the same student trained without knowledge distillation. In addition, the proposed algorithm can be seamlessly applied to various deep image matting algorithms. It is therefore effective for mobile applications (e.g., human portrait matting), which are in growing demand. The effectiveness of the proposed algorithm is verified on two public benchmark datasets.
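
To make the distillation objective concrete, the following is a minimal sketch of a similarity-preserving loss computed over batch, channel, and spatial units, in the spirit of the abstract's description. All names are illustrative; the sketch assumes the student and teacher feature maps have already been brought to the same shape (e.g., via an adaptation layer), and the paper's exact normalization and weighting may differ.

```python
import torch
import torch.nn.functional as F

def pairwise_similarity_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """Hedged sketch of a similarity-preserving KD loss.

    f_s, f_t: student/teacher feature maps of shape (B, C, H, W),
    assumed to have matching dimensions (an assumption, not the
    paper's stated setup).
    """
    def sim_loss(a_s: torch.Tensor, a_t: torch.Tensor) -> torch.Tensor:
        # Row-normalized Gram matrices encode pairwise similarities;
        # the student is pushed to reproduce the teacher's structure.
        g_s = F.normalize(a_s @ a_s.transpose(-1, -2), p=2, dim=-1)
        g_t = F.normalize(a_t @ a_t.transpose(-1, -2), p=2, dim=-1)
        return (g_s - g_t).pow(2).mean()

    b, c, h, w = f_s.shape
    # Batch unit: each sample flattened to a vector -> (B, C*H*W),
    # giving a (B, B) similarity matrix across the mini-batch.
    batch = sim_loss(f_s.reshape(b, -1), f_t.reshape(b, -1))
    # Channel unit: channels as vectors -> (B, C, H*W),
    # giving a per-sample (C, C) similarity matrix.
    chan = sim_loss(f_s.reshape(b, c, -1), f_t.reshape(b, c, -1))
    # Spatial unit: positions as vectors -> (B, H*W, C),
    # giving a per-sample (H*W, H*W) similarity matrix.
    spat = sim_loss(f_s.reshape(b, c, -1).transpose(1, 2),
                    f_t.reshape(b, c, -1).transpose(1, 2))
    return batch + chan + spat
```

In practice this term would be added, with some weighting, to the matting network's usual alpha-prediction loss; the weighting scheme here is left open since the abstract does not specify it.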
