Abstract

Diagnosis of cervical dysplasia by visual inspection is a difficult problem. Most recent approaches use deep learning techniques to extract features and detect a region of interest (RoI) in the image. Such approaches can lose visual detail that appears weak and localized within the cervical image, and they require manual annotation to extract the RoI. Moreover, labeled data are scarce, as is typical of medical images. To mitigate these problems, we present an approach that extracts global and local features from the image without manual annotation, even when data are limited. The proposed approach is applied to cervical cancer classification, and its results are demonstrated. First, we divide the cervical image into nine patches to extract visual features even when high-resolution images are unavailable. Second, we build a deep learning model that shares weights across the patches while modeling patch-patch and patch-image relationships. We also apply an attention mechanism to the model to learn the visual features of the image and to provide interpretable results. Finally, we add a loss weighting inspired by domain knowledge to the training process, which better guides learning while preventing overfitting. The evaluation results indicate improvements in sensitivity over state-of-the-art methods.

Keywords: Deep learning model; Patch-weight sharing; Attention; Loss weighting; Cervical dysplasia
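To make the described architecture concrete, the following is a minimal PyTorch sketch of the main ingredients named in the abstract: a 3x3 patch split, a single encoder reused for every patch (weight sharing), self-attention over the patch tokens plus a global image token (patch-patch and patch-image relationships), and a class-weighted cross-entropy loss standing in for the domain-inspired loss weighting. All names (PatchAttentionClassifier, split_into_patches), layer sizes, and the specific weighting values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the patch-based, weight-sharing classifier described
# in the abstract; the backbone, layer sizes, and loss weights are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def split_into_patches(image, grid=3):
    """Split a (B, C, H, W) image into a grid x grid set of patches."""
    b, c, h, w = image.shape
    ph, pw = h // grid, w // grid
    patches = image.unfold(2, ph, ph).unfold(3, pw, pw)        # (B, C, g, g, ph, pw)
    patches = patches.permute(0, 2, 3, 1, 4, 5).contiguous()   # (B, g, g, C, ph, pw)
    return patches.view(b, grid * grid, c, ph, pw)              # (B, 9, C, ph, pw)


class PatchAttentionClassifier(nn.Module):
    def __init__(self, embed_dim=128, num_classes=2):
        super().__init__()
        # One encoder applied to every patch -> weights are shared across patches.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )
        # Self-attention over the 9 patch tokens plus a global image token,
        # mixing patch-patch and patch-image information.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, image):
        patches = split_into_patches(image)                          # (B, 9, C, h, w)
        b, n = patches.shape[:2]
        tokens = self.encoder(patches.flatten(0, 1)).view(b, n, -1)  # (B, 9, D)
        # Encode a downsampled copy of the whole image as a global token.
        global_token = self.encoder(F.interpolate(image, size=patches.shape[-2:]))
        tokens = torch.cat([global_token.unsqueeze(1), tokens], dim=1)  # (B, 10, D)
        attended, attn_weights = self.attn(tokens, tokens, tokens)
        # attn_weights can be visualized to show which patches drove the decision.
        logits = self.classifier(attended[:, 0])                     # read out global token
        return logits, attn_weights


# Class-weighted cross-entropy as a stand-in for the domain-informed loss
# weighting; the actual weights are not specified in the abstract.
model = PatchAttentionClassifier()
images = torch.randn(4, 3, 192, 192)
labels = torch.randint(0, 2, (4,))
logits, attn = model(images)
loss = F.cross_entropy(logits, labels, weight=torch.tensor([1.0, 2.0]))
loss.backward()
```

In this sketch, the interpretability claim corresponds to inspecting attn_weights, whose row for the global token indicates how strongly each of the nine patches contributed to the final prediction.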


