Abstract

The study aims to apply a densely connected convolutional neural network, DenseNet121, to a data sample comprising a large set of radiographs via transfer learning. Radiography is a vital technique in the medical community for detecting diseases and abnormalities, but interpreting radiographs can take a long time and is prone to error by radiologists affected by external factors such as fatigue from long working hours, exhaustion, or distraction by other life matters. To assist radiologists, we developed a diagnostic model using deep learning to classify radiographic images into two classes, normal and abnormal, by transferring a deep convolutional neural network selected from a large set of available networks on the basis of the regions of possible abnormality provided by the radiologists for the study sample. We also studied the feasibility and performance of the well-known VGG16 model on the same data sample via transfer learning and compared its results with those of DenseNet121. The DenseNet121 model achieved a high diagnostic accuracy of 87.5% in some of the studied cases, which is a satisfactory result compared with the performance of other models. The VGG16 model, by contrast, did not give satisfactory results in this field: its classification accuracy did not exceed 55% in most cases, reaching only about 60% and 62% in two cases. The DenseNet121 model presented in this research can therefore be used in the diagnostic process to help obtain accurate diagnostic results.
As for the VGG16 model, it does not give satisfactory results according to the results obtained during the research, so it is excluded from this type of application.

Highlights

  • The study applies a densely connected convolutional neural network, DenseNet121, to a data sample comprising a large set of radiographs via transfer learning

  • To assist radiologists, we developed a deep-learning diagnostic model that classifies radiographic images into two classes, normal and abnormal, by transferring a deep convolutional neural network selected from a large set of available networks on the basis of the regions of possible abnormality provided by the radiologists for the study sample

  • We also studied the feasibility and performance of the well-known VGG16 model on the same data sample via transfer learning and compared its results with those of DenseNet121



Introduction

The study aims to apply a densely connected convolutional neural network, DenseNet121, to a data sample comprising a large set of radiographs via transfer learning. Our goal is to demonstrate the power of transfer learning in computer-vision tasks in general and in medical-image classification in particular. In this study, classification was performed on X-ray images of normal and abnormal upper-extremity bones from the MURA dataset. Unlike previous studies, a deep-learning model based on DenseNet121 was used via transfer learning from the ImageNet dataset on which the model was pre-trained, and the results obtained were compared with those of a VGG16-based deep-learning model that was also developed during the research. The main reason for applying transfer learning to the constructed CNN models is to contribute to the efficiency of the models.
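The transfer-learning setup described above can be sketched as follows. This is a minimal illustration, assuming a Keras/TensorFlow implementation; the classification head (global average pooling plus a sigmoid unit) and training settings are illustrative assumptions, not the exact configuration used in the research.

```python
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras import layers, models

def build_classifier(weights="imagenet", input_shape=(224, 224, 3)):
    """Binary normal/abnormal radiograph classifier via transfer learning."""
    # Load DenseNet121 pre-trained on ImageNet, without its classification head.
    base = DenseNet121(weights=weights, include_top=False,
                       input_shape=input_shape)
    base.trainable = False  # freeze the pretrained convolutional features

    # Attach a new head for the two-class (normal vs. abnormal) task.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

The same pattern applies to the VGG16 comparison model by swapping the base network; freezing the pretrained layers is what lets the ImageNet features transfer to the radiograph domain with relatively little training data.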

Results
Conclusion

