Abstract

Objective
This study aimed to develop a predictive model to detect osteoporosis using radiomic features from lumbar spine computed tomography (CT) images.

Methods
A total of 133 patients were included in this retrospective study, 41 men and 92 women, with a mean age of 65.45 ± 9.82 years (range: 31–94 years); 53 had normal bone mineral density (BMD), 32 osteopenia, and 48 osteoporosis. For each patient, the L1–L4 vertebrae on the CT images were automatically segmented using SenseCare and defined as regions of interest (ROIs). In total, 1,197 radiomic features were extracted from these ROIs using PyRadiomics. The most significant features were selected using logistic regression and Pearson correlation coefficient matrices. Using these features, we constructed three classification models based on the random forest (RF), support vector machine (SVM), and K-nearest neighbor (KNN) algorithms, respectively. The training and test sets were repeatedly selected using fivefold cross-validation. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC) and the confusion matrix.

Results
The classification model based on RF had the highest performance, with an AUC of 0.994 (95% confidence interval [CI]: 0.979–1.00) for differentiating normal BMD and osteoporosis, 0.866 (95% CI: 0.779–0.954) for osteopenia versus osteoporosis, and 0.940 (95% CI: 0.891–0.989) for normal BMD versus osteopenia.

Conclusions
The excellent performance of this radiomic model indicates that lumbar spine CT images can effectively be used to identify osteoporosis and serve as a tool for opportunistic osteoporosis screening.
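The modeling workflow described in the Methods (correlation-based feature filtering, three classifiers, fivefold cross-validation scored by AUC) can be sketched as follows. This is a minimal illustration using synthetic data in place of the real PyRadiomics features; the feature count, correlation threshold, and model settings are assumptions, not the study's actual configuration, and the logistic-regression selection step is omitted for brevity.

```python
# Sketch of the abstract's pipeline with synthetic stand-in data.
# NOTE: all numbers below (50 features, |r| > 0.9 cutoff, default model
# hyperparameters) are illustrative assumptions, not the study's settings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(133, 50))      # placeholder for the 1,197 radiomic features
y = rng.integers(0, 2, size=133)    # binary task, e.g. normal BMD vs. osteoporosis

# Pearson-correlation filtering: drop the later feature of any highly
# correlated pair (|r| > 0.9), keeping one representative per cluster.
corr = np.abs(np.corrcoef(X, rowvar=False))
upper = np.triu(corr, k=1)
keep = [j for j in range(X.shape[1]) if not np.any(upper[:j, j] > 0.9)]
X_sel = X[:, keep]

# Fivefold cross-validation of the three classifiers, scored by AUC.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
results = {}
for name, model in [("RF", RandomForestClassifier(random_state=0)),
                    ("SVM", SVC(probability=True, random_state=0)),
                    ("KNN", KNeighborsClassifier())]:
    results[name] = cross_val_score(model, X_sel, y, cv=cv,
                                    scoring="roc_auc").mean()
print(results)
```

On real radiomic features the correlation filter typically removes many redundant texture features; here the synthetic features are nearly uncorrelated, so almost all survive the filter.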
