Abstract
In this paper, we present VGGIN-Net, a deep neural network architecture based on a transfer learning approach, formed by freezing the layers of the pre-trained VGG16 model up to its block4_pool layer (at the lower level) and concatenating them with a randomly initialized naïve Inception block module (at the higher level). Batch normalization, flatten, dropout, and dense layers are then appended to the proposed architecture. VGGIN-Net facilitates the transfer of domain knowledge from the large ImageNet object dataset to the smaller, class-imbalanced breast cancer dataset. To improve the performance of the proposed model, regularization is applied in the form of dropout and data augmentation. Detailed block-wise fine-tuning is conducted on the proposed deep transfer network for images at different magnification factors, and the results of extensive experiments indicate a significant improvement in classification performance after fine-tuning. The proposed deep learning architecture with transfer learning and fine-tuning yields the highest accuracies in comparison with other state-of-the-art approaches on the BreakHis breast cancer dataset. The architecture is designed so that it can also be effectively transfer-learned on other breast cancer datasets.
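The architecture described above can be sketched in Keras as follows. This is a minimal illustration, not the authors' released implementation: the Inception-branch filter counts, input size, dropout rate, and number of output classes are assumptions chosen for illustration; only the VGG16 truncation at block4_pool, the naïve Inception module, and the batch-normalization/flatten/dropout/dense head come from the abstract.

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16

def naive_inception_block(x, f1=64, f2=96, f3=32):
    # Naive Inception module: parallel 1x1, 3x3 and 5x5 convolutions
    # plus 3x3 max pooling, concatenated along the channel axis.
    # Filter counts f1/f2/f3 are illustrative assumptions.
    p1 = layers.Conv2D(f1, 1, padding='same', activation='relu')(x)
    p2 = layers.Conv2D(f2, 3, padding='same', activation='relu')(x)
    p3 = layers.Conv2D(f3, 5, padding='same', activation='relu')(x)
    p4 = layers.MaxPooling2D(3, strides=1, padding='same')(x)
    return layers.Concatenate()([p1, p2, p3, p4])

def build_vggin_net(input_shape=(224, 224, 3), num_classes=2,
                    weights='imagenet'):
    # Lower level: VGG16 truncated at block4_pool, with the
    # transferred layers frozen (weights=None allows offline builds).
    base = VGG16(weights=weights, include_top=False,
                 input_shape=input_shape)
    truncated = Model(base.input, base.get_layer('block4_pool').output)
    truncated.trainable = False

    # Higher level: randomly initialized naive Inception block,
    # followed by batch normalization, flatten, dropout and dense.
    y = naive_inception_block(truncated.output)
    y = layers.BatchNormalization()(y)
    y = layers.Flatten()(y)
    y = layers.Dropout(0.5)(y)  # dropout rate is an assumption
    out = layers.Dense(num_classes, activation='softmax')(y)
    return Model(truncated.input, out)

def unfreeze_block(model, block_name):
    # Block-wise fine-tuning sketch: unfreeze one VGG16 block
    # (e.g. 'block4') so it trains jointly with the new head.
    for layer in model.layers:
        if layer.name.startswith(block_name):
            layer.trainable = True
```

For block-wise fine-tuning, the model would first be trained with the VGG16 layers frozen, then selected blocks unfrozen (e.g. `unfreeze_block(model, 'block4')`) and training resumed at a lower learning rate.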
More From: IEEE/ACM Transactions on Computational Biology and Bioinformatics