Abstract

FAST (Focused Assessment with Sonography for Trauma) is a point-of-care ultrasound study that evaluates for the presence of free fluid, typically hemoperitoneum, in trauma patients. FAST is an essential skill for emergency physicians, and it therefore requires objective evaluation tools that reduce the need for direct observation in proficiency assessment. In this work, we use deep neural networks to automatically assess operators' FAST skills. We propose a deep convolutional neural network for FAST proficiency assessment based on motion data. Prior work has shown that operators exhibit different domain-specific dexterity metrics that can distinguish novices, intermediates, and experts. We therefore augment our dataset with this domain knowledge and employ fine-tuning to improve the model's classification capabilities. Our model, however, does not require specific points of interest (POIs) to be defined for scanning. The results show that the proposed deep convolutional neural network classifies FAST proficiency with 87.5% accuracy and sensitivities of 0.884, 0.886, and 0.247 for novices, intermediates, and experts, respectively. These results demonstrate the potential of kinematics data as an input for FAST skill assessment. We also show that the proposed domain-specific features and region fine-tuning increase the model's classification accuracy and sensitivity. Variations in probe motion at different learning stages can be derived from kinematics data and used for automatic, objective skill assessment without prior identification of clinical POIs. The proposed approach can improve the quality and objectivity of FAST proficiency evaluation. Furthermore, skill assessment that combines ultrasound images and kinematics data can provide a more rigorous and diversified evaluation than ultrasound images alone.
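The abstract does not specify the network layout, so the sketch below is only a rough, hypothetical illustration of the general approach it describes: a 1D convolutional classifier over probe-kinematics time series with three output classes (novice, intermediate, expert). The channel count, window length, and all layer sizes are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch (not the authors' architecture): a 1D CNN that classifies
# probe-kinematics sequences into novice / intermediate / expert.
# Assumed input: fixed-length windows of T time steps with C motion channels
# (e.g., probe position, orientation, velocity); all sizes are illustrative.
import torch
import torch.nn as nn


class KinematicsCNN(nn.Module):
    def __init__(self, in_channels: int = 13, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=3, padding=1),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling -> fixed-size embedding
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)  # (batch, num_classes) logits


if __name__ == "__main__":
    model = KinematicsCNN(in_channels=13, num_classes=3)
    dummy = torch.randn(4, 13, 512)  # 4 windows, 13 motion channels, 512 steps
    print(model(dummy).shape)        # torch.Size([4, 3])
```

In a setup like this, the domain-specific dexterity features mentioned in the abstract could plausibly be added as extra input channels or concatenated to the pooled embedding before the final classification layer; the paper itself should be consulted for how the features and fine-tuning are actually incorporated.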
