Although large-scale pretrained convolutional neural network (CNN) models have shown impressive transfer-learning capabilities, they suffer from high energy consumption and computational cost caused by potentially redundant parameters. This study presents a weight-level pruning technique that mitigates overparameterization and thereby reduces the electricity usage of such large deep learning models. The method removes redundant parameters while preserving model accuracy. It is applied to classify Eimeria species parasites from fowls and rabbits. From a set of 27 pretrained CNN models ranging from 3.0M to 118.5M parameters, the framework identified a 4.8M-parameter model with the highest accuracy for both animals. This model was then systematically pruned, yielding an 8% reduction in parameters and a 421M reduction in floating-point operations while maintaining the same classification accuracy for both fowls and rabbits. Furthermore, unlike the existing literature, where two separate models are trained for rabbits and fowls, this article presents a single combined model with 17 classes. The resulting CNN model has a nearly 50% smaller parameter count while retaining an accuracy of over 90%.
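The core idea of weight-level pruning can be illustrated with a minimal sketch. This example assumes a simple magnitude-based criterion (the abstract does not specify the paper's exact pruning rule): the smallest-magnitude fraction of weights in a layer is zeroed out, mirroring the roughly 8% parameter reduction reported above. The function name `prune_weights` and the NumPy setup are illustrative, not taken from the paper.

```python
import numpy as np

def prune_weights(weights, sparsity=0.08):
    """Magnitude-based weight-level pruning (illustrative sketch).

    Zeroes out the smallest-magnitude fraction `sparsity` of the
    entries in `weights`, returning the pruned weights and the
    boolean mask of surviving entries.
    """
    # Threshold below which weights are considered redundant.
    threshold = np.quantile(np.abs(weights), sparsity)
    # Keep only weights whose magnitude reaches the threshold.
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Example: prune a random 64x64 weight matrix at 8% sparsity.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned, mask = prune_weights(w, sparsity=0.08)
print(f"fraction pruned: {1.0 - mask.mean():.2f}")
```

In practice, such a step is typically followed by fine-tuning so that the remaining weights compensate for the removed ones, which is how accuracy can be preserved after pruning.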