Abstract
This study aimed to evaluate the performance of deep learning algorithms for the classification and segmentation of impacted mesiodens in pediatric panoramic radiographs. A total of 850 panoramic radiographs of pediatric patients (aged 3-9 years) were included in this study. The U-Net semantic segmentation algorithm was applied for the detection and segmentation of mesiodens in the upper anterior region. To enhance the algorithm, pre-trained ResNet models were applied to the encoding path. The segmentation performance of the algorithm was tested using the Jaccard index and Dice coefficient. The diagnostic accuracy, precision, recall, F1-score, and time to diagnosis of the algorithms were compared with those of human expert groups using the test dataset. Cohen's kappa statistics were compared between the model and the human groups. The segmentation model exhibited a high Jaccard index and Dice coefficient (>90%). In mesiodens diagnosis, the trained model achieved 91-92% accuracy and a 94-95% F1-score, which were comparable to the human expert group results (96%). The diagnostic duration of the deep learning model was 7.5 seconds, which was significantly faster than that of the human groups in mesiodens detection. The agreement between the deep learning model and the human experts was moderate (Cohen's kappa = 0.767). The proposed deep learning algorithm showed good segmentation performance and approached the performance of human experts in the diagnosis of mesiodens, with a significantly faster diagnosis time.
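The abstract describes a U-Net whose encoding path uses a pre-trained ResNet. As a rough illustration only (the paper does not specify the ResNet depth, input channels, or framework used), a U-Net with a ResNet encoder could be instantiated as in the following minimal sketch; the choice of `resnet34`, ImageNet weights, and a single grayscale input channel are assumptions for illustration, not details taken from the study.

```python
import segmentation_models_pytorch as smp

# Hypothetical configuration: U-Net decoder with a pre-trained ResNet encoder,
# single-channel (grayscale) radiograph input, single-class mesiodens mask output.
model = smp.Unet(
    encoder_name="resnet34",      # assumed depth; the study does not state which ResNet
    encoder_weights="imagenet",   # pre-trained weights applied to the encoding path
    in_channels=1,                # panoramic radiographs treated as grayscale
    classes=1,                    # binary mesiodens segmentation mask
)
```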
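The Jaccard index and Dice coefficient reported above both measure the overlap between a predicted mask and the ground-truth mask. A minimal sketch of how they can be computed for binary NumPy masks is shown below; the function and variable names are illustrative and not taken from the study's code.

```python
import numpy as np

def jaccard_index(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7) -> float:
    """Intersection over union of two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(intersection / (union + eps))

def dice_coefficient(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2*|A ∩ B| / (|A| + |B|) for two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float(2 * intersection / (pred.sum() + gt.sum() + eps))
```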
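Cohen's kappa, used here to compare the model's diagnoses with those of the human experts, measures agreement between two raters beyond chance. A short sketch using scikit-learn is given below; the per-image labels are hypothetical placeholders, not data from the study.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-image diagnoses: 1 = mesiodens present, 0 = absent.
model_labels  = [1, 0, 1, 1, 0, 1, 0, 0]
expert_labels = [1, 0, 1, 0, 0, 1, 0, 1]

kappa = cohen_kappa_score(model_labels, expert_labels)
print(f"Cohen's kappa: {kappa:.3f}")
```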