Abstract

Serous effusion is the excess accumulation of fluid in the serous cavities due to various underlying pathological conditions. Cytopathological assessment of serous effusions is based on identifying the cells in the fluid by their morphology and texture. This assessment is a physically and mentally laborious task and is subject to inter-pathologist variability. In the literature, only a small number of feature-based methods have been proposed for automated serous cell classification. In this study, transfer learning with pre-trained deep convolutional neural networks (ConvNets) is proposed to automatically identify 11 categories of serous cells in effusion cytology. Unlike methods that rely on extracting cellular features such as morphology and texture, this is an appearance-based machine learning approach. We fine-tuned four pre-trained ConvNet architectures, namely AlexNet, GoogLeNet, ResNet, and DenseNet, on the serous cell dataset. To reduce overfitting, we augmented the data by image rotation, translation, and mirroring. The proposed method was evaluated on both the original and augmented sets of serous cells derived from a publicly available dataset. Among the four ConvNet models, ResNet and DenseNet obtained the highest accuracies of 93.44% and 92.90%, respectively. When the two models were compared in terms of accuracy and model complexity, the fine-tuned ResNet (ResNet-TL) was selected as the best network model. Data augmentation increased the accuracy by approximately 10% compared to training without augmentation. The results show that higher classification accuracy was achieved than with traditional methods, without requiring precise cell segmentation.
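The augmentation strategy described above (rotation, translation, and mirroring) can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code; the specific rotation angles and translation offsets are assumptions for the sake of the example.

```python
import numpy as np

def augment(image):
    """Generate augmented copies of a cropped cell image via rotation,
    mirroring, and translation (illustrative parameters, not the paper's)."""
    augmented = []
    # Rotations by 90, 180, and 270 degrees in the image plane
    for k in (1, 2, 3):
        augmented.append(np.rot90(image, k))
    # Horizontal and vertical mirroring
    augmented.append(np.fliplr(image))
    augmented.append(np.flipud(image))
    # Small translations: shift by +/-5 pixels along each spatial axis
    # (wrap-around via np.roll; padding would be an alternative choice)
    for shift, axis in ((5, 0), (-5, 0), (5, 1), (-5, 1)):
        augmented.append(np.roll(image, shift, axis=axis))
    return augmented

cell = np.random.rand(64, 64, 3)  # placeholder for a cropped serous cell image
copies = augment(cell)
print(len(copies))  # 9 augmented copies per original image
```

Each original image yields nine augmented variants here; in practice the number and kind of transformations would be tuned to the dataset size and the degree of overfitting observed.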

