Abstract

This study aimed to develop a fully automated, artificial intelligence-aided cervical vertebral maturation (CVM) classification method based on convolutional neural networks (CNNs) to provide an auxiliary diagnosis for orthodontists. The study sample consisted of cephalometric images from patients aged 5 to 18 years. After orthodontists grouped the images into six cervical stages (CSs), a data set was constructed for CVM analysis using CNNs. The data set was divided into training, validation, and test sets at a ratio of 70%, 15%, and 15%. Four CNN models, namely VGG16, GoogLeNet, DenseNet161, and ResNet152, were selected as candidate models. After training and validation, the models were evaluated to determine which was most suitable for CVM analysis, and heat maps were analyzed for a deeper understanding of what the CNNs had learned. On the test set, the classification accuracy ranking was ResNet152 > DenseNet161 > GoogLeNet > VGG16. ResNet152 proved to be the best of the four models for CVM classification, with a weighted κ of 0.826, an average AUC of 0.933, and a total accuracy of 67.06%. The F1-score ranking across subgroups was CS6 > CS1 > CS4 > CS5 > CS3 > CS2. The areas of the third (C3) and fourth (C4) cervical vertebrae were activated when the CNNs assessed the images. CNN models proved to be a convenient, fast, and reliable method for CVM analysis and have the potential to serve as automatic auxiliary diagnostic tools in the future.
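For concreteness, the sketch below illustrates how a pipeline of this kind could be assembled with PyTorch, torchvision, and scikit-learn. It is not the authors' implementation: the directory layout, preprocessing, input size, and the use of quadratic weighting for κ are assumptions made purely for illustration.

```python
import torch.nn as nn
from torch.utils.data import random_split
from torchvision import datasets, models, transforms
from sklearn.metrics import cohen_kappa_score, f1_score, roc_auc_score

# --- Data: cephalograms grouped into six cervical stages (CS1-CS6) ---
# A "cephalograms/" folder with one subfolder per stage is an assumed layout.
tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # radiographs are grayscale
    transforms.Resize((224, 224)),                # input size is an assumption
    transforms.ToTensor(),
])
full_set = datasets.ImageFolder("cephalograms/", transform=tfm)

# 70/15/15 split into training, validation, and test sets, as in the study.
n = len(full_set)
n_train, n_val = int(0.70 * n), int(0.15 * n)
train_set, val_set, test_set = random_split(
    full_set, [n_train, n_val, n - n_train - n_val]
)

# --- Model: ImageNet-pretrained ResNet152 adapted to the 6 output classes ---
model = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 6)
# The other candidates swap in the same way, e.g. models.vgg16,
# models.googlenet, models.densenet161 (each replacing its own classifier head).

# --- Metrics reported in the abstract, computed after inference ---
# y_true: integer stage labels; y_prob: softmax outputs of shape (N, 6).
def evaluate(y_true, y_prob):
    y_pred = y_prob.argmax(axis=1)
    return {
        # The weighting scheme (quadratic vs. linear) is an assumption here.
        "weighted_kappa": cohen_kappa_score(y_true, y_pred, weights="quadratic"),
        "macro_auc": roc_auc_score(y_true, y_prob, multi_class="ovr"),
        "per_class_f1": f1_score(y_true, y_pred, average=None),
    }
```

Heat maps of the kind described above are commonly generated with class-activation methods such as Grad-CAM, although the abstract does not specify which technique was used.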
