Abstract

Bone age determination is important for the diagnosis and treatment of growing children. This study aimed to develop a deep-learning model for bone age estimation using lateral cephalometric radiographs (LCRs) and regions of interest (ROIs) in growing children and to evaluate its performance. This retrospective study included 1050 patients aged 4-18 years who underwent LCR and hand-wrist radiography on the same day at Pusan National University Dental Hospital and Ulsan University Hospital between January 2014 and June 2023. Two pretrained convolutional neural networks, InceptionResNet-v2 and NASNet-Large, were employed to develop a deep-learning model for bone age estimation. The LCRs and ROIs, which were designated as the cervical vertebrae areas, were labeled according to the patient's bone age. Bone age was obtained from each patient's hand-wrist radiograph. Deep-learning models trained with five-fold cross-validation were tested using internal and external validation. The LCR-trained model outperformed the ROI-trained model. In addition, visualization of each deep-learning model using the gradient-weighted regression activation mapping technique revealed differences in focus during bone age estimation. The findings of this comparative study are significant because they demonstrate the feasibility of bone age estimation via deep learning with craniofacial bones and dentition, in addition to the cervical vertebrae, on the LCRs of growing children.
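The five-fold cross-validation scheme mentioned above can be sketched as follows. This is a minimal illustration, not the study's actual code: the patient count (1050) comes from the abstract, while the random seed, fold assignment, and all variable names are assumptions.

```python
import random

# 1050 patients, per the abstract; IDs here are synthetic stand-ins.
n_patients = 1050
ids = list(range(n_patients))
random.Random(42).shuffle(ids)  # assumed seed for a reproducible split

# Partition into 5 equally sized folds (1050 / 5 = 210 patients each).
k = 5
fold_size = n_patients // k
folds = [ids[i * fold_size:(i + 1) * fold_size] for i in range(k)]

for i, val_ids in enumerate(folds):
    # Each fold serves once as the validation set; the rest form the training set.
    train_ids = [p for j, fold in enumerate(folds) if j != i for p in fold]
    # In the study, a pretrained CNN (e.g. InceptionResNet-v2) would be
    # fine-tuned on train_ids and evaluated on val_ids here.
    print(len(train_ids), len(val_ids))  # 840 210, in every fold
```

Internal and external validation sets, as described in the abstract, would then be held out entirely from this cross-validation loop.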
