Abstract

A large and balanced training dataset is the foremost requirement for proper convergence of a deep convolutional neural network (CNN). Medical data are typically imbalanced and scarce, which makes it difficult to train a CNN from scratch. Transfer learning offers great potential for dealing with inadequate datasets, besides the benefit of faster training, yet the efficient transfer of knowledge from natural images to histopathological images has not been fully achieved. In view of the foregoing, this work classifies the BreakHis dataset using a pre-trained AlexNet model with a suitable fine-tuning approach. The effective depth of fine-tuning is also determined at each magnification level (40×, 100×, 200× and 400×). The experimental trials confirm that a moderate level of fine-tuning is the optimal choice for classifying magnification-dependent histology images, in contrast to shallow and deep tuning of the pre-trained network, with the appropriate depth depending on the size and relative distribution of the dataset. Additionally, the layer-wise fine-tuning approach delivers performance on par with the latest state-of-the-art developments. A minimal sketch of this kind of depth-controlled fine-tuning is given below.
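The following sketch is not the authors' code; it only illustrates the general idea of layer-wise fine-tuning of a pre-trained AlexNet, assuming a PyTorch/torchvision setup, a binary benign/malignant target, and a hypothetical freeze_depth parameter that controls how many of the five convolutional blocks remain frozen (a moderate setting fine-tunes the deeper convolutional blocks and the classifier while keeping the earliest blocks fixed).

```python
# Hedged sketch of layer-wise fine-tuning with torchvision's AlexNet.
# `freeze_depth` and `build_finetuned_alexnet` are illustrative names,
# not taken from the paper.
import torch.nn as nn
from torchvision import models

def build_finetuned_alexnet(num_classes: int = 2, freeze_depth: int = 2) -> nn.Module:
    # Load AlexNet pre-trained on ImageNet (natural images).
    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

    # Indices in model.features where each of the five conv layers begins.
    conv_starts = [0, 3, 6, 8, 10]
    cutoff = (conv_starts[freeze_depth]
              if freeze_depth < len(conv_starts) else len(model.features))

    # Freeze the first `freeze_depth` convolutional blocks; deeper blocks
    # stay trainable, giving a "moderate" rather than shallow or deep tuning.
    for idx, layer in enumerate(model.features):
        trainable = idx >= cutoff
        for param in layer.parameters():
            param.requires_grad = trainable

    # Replace the final fully connected layer for the two-class problem.
    model.classifier[6] = nn.Linear(4096, num_classes)
    return model
```

In practice, the same constructor could be called with different freeze_depth values for the 40×, 100×, 200× and 400× subsets to probe which fine-tuning depth works best at each magnification.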
