Abstract
Background and objective: Significant progress has been made in automated medical diagnosis with the advent of deep learning methods in recent years. However, deploying a deep learning model on mobile and small-scale, low-cost devices remains a major bottleneck. Furthermore, breast cancer is increasingly prevalent, with ductal carcinoma its most common type. Although many machine/deep learning methods have already been investigated, there is still room for improvement.

Method: This paper proposes a deep convolutional neural network (CNN) based transfer learning approach, complemented with structured filter pruning, for histopathological image classification that reduces the run-time resource requirements of the trained models. In the proposed method, the less important filters are first pruned from the convolutional layers, and the pruned models are then trained on the histopathological image dataset.

Results: We performed extensive experiments using three popular pre-trained CNNs: VGG19, ResNet34, and ResNet50. With the pruned VGG19 model, we achieved an accuracy of 91.25%, outperforming earlier methods on the same dataset and architecture while reducing FLOPs by 63.46%. With the pruned ResNet34 model, the accuracy increased to 91.80% with 40.63% fewer FLOPs. With the pruned ResNet50 model, we achieved an accuracy of 92.07% with 30.97% fewer FLOPs.

Conclusion: The experimental results reveal that pre-trained models complemented with filter pruning outperform the original pre-trained models. Another important outcome of this research is that the pruned models, with their reduced resource requirements, can readily be deployed on point-of-care devices for automated diagnosis applications.
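The abstract does not specify the pruning criterion; a common choice for structured filter pruning is ranking each convolutional filter by its L1 norm and removing the weakest ones before fine-tuning. The sketch below illustrates that idea for a single layer's weight tensor; the function name, the pruning ratio, and the use of the L1 norm as the importance score are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def prune_filters_l1(weight, prune_ratio):
    """Rank the filters of one conv layer by L1 norm and drop the weakest.

    weight      : array of shape (out_channels, in_channels, kH, kW)
    prune_ratio : fraction of filters to remove, e.g. 0.5
    Returns the pruned weight tensor and the indices of the kept filters.
    """
    n_filters = weight.shape[0]
    n_keep = n_filters - int(n_filters * prune_ratio)
    # Importance score: sum of absolute weights (L1 norm) per filter
    scores = np.abs(weight).reshape(n_filters, -1).sum(axis=1)
    # Keep the highest-scoring filters, preserving their original order
    keep = np.sort(np.argsort(scores)[-n_keep:])
    return weight[keep], keep

# Toy example: 4 filters whose L1 norms grow with their index
w = np.zeros((4, 2, 3, 3))
for i in range(4):
    w[i] = i + 1
pruned, keep = prune_filters_l1(w, 0.5)
print(pruned.shape, keep)  # the two strongest filters survive
```

After pruning, the corresponding output channels (and the matching input channels of the next layer) are removed, which is what yields the FLOP reductions reported above; the pruned network is then fine-tuned on the target dataset.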