Abstract

Gray-scale ultrasound (US) imaging is commonly used to assess synovitis in rheumatoid arthritis (RA) in clinical practice. The standard four-grade scoring system depends heavily on the sonographer's experience and has relatively low validity compared with quantitative indexes. However, training a qualified sonographer is expensive and time-consuming, and few studies have focused on automatic RA grading methods. The purpose of this study is to propose an automatic RA grading method using deep convolutional neural networks (DCNN) to assist clinical assessment. Gray-scale ultrasound images of finger joints are taken as inputs, and the output is the corresponding RA grade. First, we automatically localized the synovium in the RA images and achieved high localization precision. To compensate for the lack of a large annotated training dataset, we applied data augmentation to increase the number of training samples. Motivated by transfer learning, we used a GoogLeNet pre-trained on ImageNet as a feature extractor and fine-tuned it on our own dataset. The detection results showed an average precision exceeding 90%. In the RA severity grading experiments, the four-grade classification accuracy exceeded 90% and the binary classification accuracies exceeded 95%. These results demonstrate that our proposed method achieves performance comparable to RA experts in multi-class classification. The promising results suggest that the proposed DCNN-based RA grading method can provide an objective and accurate reference to assist RA diagnosis and the training of sonographers.
