Abstract

This study presents Relevance–Redundancy Filter-Level Weights Pruning (RRWFP), a novel filter pruning technique for convolutional neural networks (CNNs) designed to improve their deployment on resource-constrained devices. RRWFP utilises mutual information theory to determine filter relevance by analysing the mutual information within filter output activation maps. This metric identifies filters for removal based jointly on their redundancy and relevance, achieving a balance that minimises the effect on model accuracy. Empirical evaluations on the CIFAR-10, CIFAR-100, and ImageNet datasets demonstrate the effectiveness of RRWFP. Notably, it achieves minimal accuracy reductions (0.24 % for CIFAR-100 on VGG-16 and 1.01 % for ImageNet on ResNet-50) while significantly reducing model complexity (up to 94.35 % parameter reduction in VGG-16). The results highlight the benefit of incorporating both relevance and redundancy in filter pruning, outperforming conventional techniques that address these factors separately.
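The idea of scoring filters by relevance (mutual information with the task) and redundancy (mutual information with other filters) can be illustrated with a minimal sketch. This is not the authors' implementation: the histogram-based MI estimator, the per-sample mean-pooling of activation maps, and the simple relevance-minus-redundancy score are all illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based MI estimate between two 1-D samples (illustrative)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over x
    py = pxy.sum(axis=0, keepdims=True)   # marginal over y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def rank_filters(activations, labels, bins=16):
    """Order filters from most to least prunable.

    activations: (num_samples, num_filters, H, W) output activation maps
    labels:      (num_samples,) class labels
    Mean-pooling each map to one value per sample is an assumption made
    here for simplicity; it is not taken from the paper.
    """
    n_filters = activations.shape[1]
    flat = activations.reshape(activations.shape[0], n_filters, -1).mean(axis=2)
    # Relevance: MI between a filter's activation and the class label.
    relevance = np.array([
        mutual_information(flat[:, i], labels.astype(float), bins)
        for i in range(n_filters)
    ])
    # Redundancy: mean MI between a filter and every other filter.
    redundancy = np.array([
        np.mean([mutual_information(flat[:, i], flat[:, j], bins)
                 for j in range(n_filters) if j != i])
        for i in range(n_filters)
    ])
    score = relevance - redundancy  # keep relevant, non-redundant filters
    return np.argsort(score)        # lowest-scoring filters pruned first
```

A pruning pass would then drop the first k indices returned by `rank_filters` from the layer and its downstream connections; balancing the two terms (rather than using relevance or redundancy alone) is the behaviour the abstract attributes to RRWFP.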
