Abstract

Although large‐scale pretrained convolutional neural network (CNN) models have shown impressive transfer learning capabilities, they come with drawbacks such as high energy consumption and computational cost due to their potentially redundant parameters. This study presents a weight‐level pruning technique that mitigates overparameterization and thereby reduces the energy usage of such large deep learning models. The method removes redundant parameters while upholding model accuracy. The methodology is applied to classify Eimeria species parasites from fowls and rabbits. By evaluating a set of 27 pretrained CNN models with parameter counts between 3.0M and 118.5M, the framework identified a 4.8M‐parameter model with the highest accuracy for both animals. This model is then subjected to a systematic pruning process, resulting in an 8% reduction in parameters and a 421M reduction in floating point operations while maintaining the same classification accuracy for both fowls and rabbits. Furthermore, unlike the existing literature, where two separate models are created for rabbits and fowls, this article presents a combined model with 17 classes. This approach yields a CNN model with a nearly 50% smaller parameter count while retaining the same accuracy of over 90%.
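The abstract does not detail the pruning criterion, but weight‐level pruning is commonly realized by zeroing out the smallest‐magnitude weights. The following is a minimal, framework‐free sketch of that idea, assuming a magnitude threshold chosen from a target sparsity; the function name and toy values are illustrative, not the paper's implementation.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    A generic magnitude-based weight-level pruning sketch; the exact
    criterion and schedule used in the paper are not specified here.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)              # number of weights to remove
    threshold = flat[k - 1] if k > 0 else float("-inf")
    pruned = [0.0 if abs(w) <= threshold else w for w in weights]
    removed = sum(1 for w in pruned if w == 0.0)
    return pruned, removed

# Toy example: prune 20% of a small weight vector.
weights = [0.9, -0.01, 0.4, 0.002, -0.7, 0.05, 1.2, -0.3, 0.6, 0.08]
pruned, removed = magnitude_prune(weights, 0.2)
```

In practice such pruning is applied layer by layer to a trained network, often followed by a brief fine‐tuning pass so that accuracy is preserved despite the removed parameters.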
