Abstract

Background and aim: Accurate diagnosis and treatment of female infertility conditions support future reproductive planning. Although current deep learning frameworks can classify and separately count all types of ovarian follicles (OFs) with high accuracy, these solutions suffer from misclassification errors and high computational complexity caused by a positive bias effect and internal covariate shift. The objective of this paper is to increase the classification accuracy of OFs and to reduce the computational cost of classification via deep learning (DL). Methodology: Our framework for follicle classification and counting (FCaC) uses filter-based segmentation. A new method is also proposed to accelerate learning and to normalize the input layer by adjusting and scaling the activations. It combines a modified activation function (MAF), the displaced rectifier linear unit (DReLU), with batch normalization (BN) in the feature extraction and classification stages. Faster and more stable training is therefore achieved by modifying the input distribution of the activation function (AF). Results: The proposed system obtained a mean classification accuracy of 97.614%, 2.264 percentage points higher than the state of the art. Furthermore, the model processes a single whole-slide image (WSI) about 30% faster (10.23 seconds versus 14.646 seconds for existing solutions). Conclusion: The proposed system focuses on accurate classification of histology images. It also converges faster and learns more effectively thanks to BN and the MAF. We identified the positive bias effect and internal covariate shift as the main aspects to address in order to improve classification performance.
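The abstract's core idea, pairing batch normalization with a displaced rectifier, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the displacement value `delta=0.05` and the function shapes below are assumptions based on the common formulation of DReLU, in which the rectification point is shifted into the negative range so that activations are no longer strictly non-negative (countering the positive bias the abstract mentions):

```python
import numpy as np

def drelu(x, delta=0.05):
    """Displaced rectifier linear unit (common formulation, assumed here):
    identity for x >= -delta, clamped at -delta below. Unlike ReLU
    (max(x, 0)), outputs can be slightly negative, reducing positive bias.
    The displacement delta=0.05 is an illustrative choice, not a value
    taken from the paper."""
    return np.maximum(x, -delta)

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization over the batch axis: standardize to zero mean
    and unit variance, then apply a learnable scale (gamma) and shift
    (beta). This stabilizes the input distribution of the following
    activation, addressing internal covariate shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# A layer in the proposed style would apply BN before the activation:
features = np.random.randn(64, 16)          # hypothetical mini-batch
activations = drelu(batch_norm(features))   # BN -> DReLU
```

The order matters: normalizing first keeps the activation's input distribution stable across training steps, which is the mechanism the abstract credits for faster, more stable convergence.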
