Abstract

Oral cancer is a prevalent disease of the head and neck region. Because of its high incidence and serious consequences, accurate diagnosis of malignant oral tumors is a major priority, and early diagnosis allows the patient a prompt and more effective response to treatment. The most effective way to diagnose oral cancer is histopathological imaging, which provides a detailed view of tissue at the cellular level. However, accurate and automatic classification of oral histopathological images remains difficult because of the complex nature of cell images, staining methods, and imaging conditions. Deep learning applied to imaging and computational diagnostics can assist doctors and physicians in automatically analysing Oral Squamous Cell Carcinoma biopsy images in a timely and efficient manner, reducing the operational workload of the pathologist and enhancing patient management. Training deeper neural networks, however, takes considerable time and computing resources because of network complexity and the gradient diffusion problem. Motivated by this, and inspired by ResNet's success in handling gradient diffusion, this study proposes a novel improved ResNet-based model for the automated multistage classification of oral histopathology images. Three prospective candidate model blocks are presented and analyzed, and the best candidate is chosen as the optimal model, which classifies oral lesions into well-differentiated, moderately differentiated, and poorly differentiated grades in significantly reduced time, with 97.59% accuracy.
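The abstract does not specify the architecture in detail, so the following is only a minimal illustrative sketch, not the authors' model: a basic residual block and a small ResNet-style classifier with three output classes (well, moderately, and poorly differentiated), showing how the identity shortcut lets gradients bypass the convolutional path and thereby mitigates gradient diffusion in deeper networks. All layer sizes, names, and the input resolution are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Conv-BN-ReLU x2 with an identity (or 1x1-projected) skip connection."""

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        # Project the shortcut only when the spatial size or channel count changes.
        self.shortcut = nn.Identity()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: gradients can flow through the identity path directly.
        return torch.relu(out + self.shortcut(x))


class SmallResNetClassifier(nn.Module):
    """Tiny ResNet-style network for 3-class grading (hypothetical configuration)."""

    def __init__(self, num_classes=3):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 32, 7, stride=2, padding=3, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2, padding=1),
        )
        self.layers = nn.Sequential(
            ResidualBlock(32, 32),
            ResidualBlock(32, 64, stride=2),
            ResidualBlock(64, 128, stride=2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, num_classes)
        )

    def forward(self, x):
        return self.head(self.layers(self.stem(x)))


if __name__ == "__main__":
    # One 224x224 RGB histopathology patch -> logits for the three grades.
    model = SmallResNetClassifier()
    logits = model(torch.randn(1, 3, 224, 224))
    print(logits.shape)  # torch.Size([1, 3])
```

In practice the paper's candidate blocks would replace `ResidualBlock` above; the key design choice illustrated here is only the additive shortcut, which keeps the gradient path short regardless of network depth.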
