Abstract

Purpose: We introduce a fully automated, open-source tool for staging knee osteoarthritis severity from X-ray images. Recent developments in machine learning, specifically in deep learning, are transforming medical image analysis, yet automated analysis of X-ray and Magnetic Resonance Imaging (MRI) data remains a major bottleneck in osteoarthritis research. The Osteoarthritis Initiative (OAI) database presents a unique opportunity to transfer advances in deep learning to osteoarthritis research. Toward that goal, we demonstrate the feasibility of deep learning approaches for automated staging of knee osteoarthritis severity from X-ray data.

Methods: We used 7549 fixed-flexion X-ray images (of right and left knees) from the baseline visit of the OAI study, previously graded by expert radiologists on the Kellgren-Lawrence (KL) scale. We held out 25% of the data for testing and used the remaining 75% to build the model; this latter set was further partitioned into training (75%) and validation (25%) sets. We trained a deep neural network (a faster region-based convolutional neural network, Faster R-CNN) with two objectives: (1) extract the knee-joint region from the X-ray images (Figure 1A) and (2) classify the knee-joint regions on the KL scale (0-4). Our approach achieves both objectives jointly: it first learns to identify candidate knee-joint regions using a region-proposal neural network and then classifies these regions using an object-classification neural network. We initialized the model from a network pre-trained on ImageNet and used a training scheme that alternates between fine-tuning the region-proposal and object-classification networks, because sharing features between the two tasks was expected to improve prediction speed.
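The nested split described above (25% test, then 25% of the remainder for validation) can be sketched as follows; this is an illustrative reconstruction, not the study's actual code, and the random seed is an arbitrary assumption:

```python
import numpy as np

def split_indices(n, test_frac=0.25, val_frac=0.25, seed=0):
    """Shuffle n sample indices and split them into training,
    validation, and test sets following the scheme in the text:
    25% held out for testing, then 25% of the remainder used
    for validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_test = int(round(n * test_frac))
    test, rest = idx[:n_test], idx[n_test:]
    n_val = int(round(len(rest) * val_frac))
    val, train = rest[:n_val], rest[n_val:]
    return train, val, test

# 7549 images, as in the study
train, val, test = split_indices(7549)
```

With 7549 images this yields 1887 test images and splits the remaining 5662 into validation and training sets.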
The region-proposal network was trained using hand-labeled knee regions as the ground truth; we counted a region proposal as successful if the intersection of the predicted and manually selected regions exceeded 75% of the union of these regions. The object-classification network was trained using the KL scores assigned by expert raters. Model parameters were selected based on their performance on the validation set. Here, we report the accuracy of these models on the test data, which were not used in any of the training and validation steps. The input to the final model is an X-ray image, and the output, computed within a few seconds, is a KL score.

Results: The region-proposal network predicted the knee-joint regions with 99.9% accuracy (Figure 1A). The object-classification network predicted KL scores of 0 with 78.4% accuracy, KL scores of 1 with 32.3% accuracy, KL scores of 2 with 66.7% accuracy, KL scores of 3 with 79% accuracy, and KL scores of 4 with 87% accuracy (Figure 1B). Of all the misclassifications, 86.8% were within one grade of the ground truth. Merging grade 0 and grade 1 X-rays into a single class, for example, increased the overall classification accuracy from 67.2% to 88.2%.

Conclusions: Numerous tools for knee X-ray and MRI analysis remain closed source and unavailable to the wider community. To accelerate progress toward automated analysis of imaging data, we developed an open-source tool for X-ray classification. While this model can be improved with additional hyperparameter tuning, its current performance is far above that of a random five-class classifier (i.e., 20%). Its disagreements with expert radiologists occur at the boundaries of the KL scale (e.g., distinguishing a KL of 0 from 1), where even expert raters do not reach perfect agreement.
Future work will focus on comparing the performance of this model with inter-rater agreement among expert radiologists, reducing the need for large volumes of labeled data, incorporating structured information (such as demographics) to augment performance, and extending these approaches to automated segmentation of MRI data.
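The proposal-success criterion in the Methods (intersection larger than 75% of the union, i.e., intersection-over-union > 0.75) can be made concrete with a short sketch. The box format `(x1, y1, x2, y2)` and the helper names are assumptions for illustration, not the study's implementation:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes
    given as (x1, y1, x2, y2) with x1 < x2 and y1 < y2."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    # Overlap is zero when the boxes do not intersect.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def proposal_succeeds(pred, truth, threshold=0.75):
    """Success criterion from the text: intersection must exceed
    75% of the union of predicted and hand-labeled regions."""
    return iou(pred, truth) > threshold
```

For example, two identical boxes have an IoU of 1.0 and count as a successful proposal, while two unit-overlap boxes such as `(0, 0, 2, 2)` and `(1, 1, 3, 3)` have an IoU of 1/7 and do not.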
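The effect of merging KL grades 0 and 1 on overall accuracy, as reported in the Results, can be computed directly from a confusion matrix. The sketch below uses a small toy matrix, not the study's actual results:

```python
import numpy as np

def accuracy_after_merge(cm, groups):
    """Overall accuracy of confusion matrix `cm` (rows = true
    class, columns = predicted class) after merging classes
    into the given groups of original class indices."""
    cm = np.asarray(cm)
    k = len(groups)
    merged = np.zeros((k, k))
    for i, gi in enumerate(groups):
        for j, gj in enumerate(groups):
            # Sum the block of cells covered by this group pair.
            merged[i, j] = cm[np.ix_(gi, gj)].sum()
    return merged.trace() / merged.sum()

# Toy 3-class example: most errors are between classes 0 and 1.
cm = [[5, 3, 0],
      [2, 4, 0],
      [0, 0, 6]]
acc_all = accuracy_after_merge(cm, [[0], [1], [2]])   # no merging
acc_merged = accuracy_after_merge(cm, [[0, 1], [2]])  # merge 0 and 1
```

As in the study, merging the two classes that are most often confused with each other raises the overall accuracy, because errors between them no longer count as misclassifications.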
