Abstract

Background and Objective: Computer-aided cervical cancer screening based on automated recognition of cervical cells has the potential to significantly reduce error rates and increase productivity compared with manual screening. Traditional methods often rely on accurate cell segmentation and the extraction of discriminative hand-crafted features. Recently, detectors based on convolutional neural networks have been applied to remove the dependency on hand-crafted features and eliminate the need for segmentation. However, these methods tend to yield too many false-positive predictions.

Methods: This paper proposes a global context-aware framework to address this problem, which integrates global context information through an image-level classification branch and a weighted loss. The prediction of this branch is merged into cell detection to filter out false-positive predictions. Furthermore, a new ground-truth assignment strategy for the feature pyramid, called soft scale anchor matching, is proposed, which matches ground truths with anchors across scales softly. This strategy searches for the most appropriate representation of each ground truth in every layer and adds more positive samples at different scales, which facilitates feature learning.

Results: The proposed methods achieve a 5.7% increase in mean average precision and an 18.5% increase in specificity, at the cost of a 2.6% increase in inference time.

Conclusions: The proposed methods, which entirely avoid dependence on cervical cell segmentation, show great potential to reduce the workload of pathologists in automation-assisted cervical cancer screening.
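The two mechanisms named in the abstract can be sketched in code. The following is a minimal illustration, not the paper's actual implementation: `filter_detections` assumes the image-level branch's output is fused with per-box scores by simple multiplication before thresholding, and `soft_scale_match` assumes the soft assignment takes the form of a Gaussian weight over the log-ratio between ground-truth size and each pyramid layer's anchor scale. Both fusion rules and all parameter names are hypothetical.

```python
import numpy as np

def filter_detections(boxes, scores, image_prob, threshold=0.5):
    """Suppress per-cell detections using the image-level prediction.

    Hypothetical fusion rule: each detection score is down-weighted by the
    image-level abnormality probability, so a confidently negative image
    removes spurious cell-level false positives.
    """
    fused = scores * image_prob            # fuse global and local evidence
    keep = fused >= threshold              # drop low-confidence detections
    return boxes[keep], fused[keep]

def soft_scale_match(gt_size, layer_anchor_sizes, sigma=0.5):
    """Soft weight of each pyramid layer for one ground-truth box.

    Hypothetical form: layers whose anchor scale is close to the ground-truth
    size (in log2 space) receive weight near 1, distant scales decay smoothly,
    so a ground truth contributes positive samples at several scales.
    """
    ratios = np.log2(gt_size / np.asarray(layer_anchor_sizes, dtype=float))
    return np.exp(-(ratios ** 2) / (2.0 * sigma ** 2))
```

For example, a ground-truth box of size 64 matched against anchor scales (32, 64, 128) would weight the middle layer most strongly while still assigning reduced positive weight to the neighboring scales.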
