Abstract

INTRODUCTION: Automated video-based feedback on a surgeon’s technical skills would improve patient care and facilitate trainee education, but current feedback methods require the development of task-specific metrics. A generalizable, automated method of surgeon skill assessment based on surgeon tool-usage signatures could expedite the development and validation of video-based feedback methods.

METHODS: Annotated images from the publicly available SOCAL dataset were used to train an off-the-shelf computer vision model (YOLOv4) that detected surgical instruments. The Shannon entropy of unique instrument combinations was calculated for each surgical trial. Logistic regression was used to predict trial success from Shannon entropy.

RESULTS: Surgeon signatures based on Shannon entropy were created for each trial and each surgeon. Instrument usage patterns differed between successful and unsuccessful trials. The Shannon entropy of instrument combinations correlated significantly with trial success (p < 0.001) and predicted success with 97% average precision and 78% accuracy using computer vision detections. Unsuccessful trials displayed a rapid initial peak in entropy that then declined over time. In contrast, successful trials demonstrated a slower progression of tool usage, with gradually increasing use of a critical final maneuver.

CONCLUSIONS: Surgeon signatures based on Shannon entropy predicted task success with acceptable accuracy and revealed distinct patterns in successful and unsuccessful trials. Shannon entropy offers a generalizable, summative signal of surgeon performance, regardless of the task, and can predict outcomes based only on the probabilities of unique instrument combinations. Future efforts will describe instrument movement signatures.
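
The entropy measure described in the METHODS can be illustrated with a minimal sketch: treat each frame's set of detected instruments as one observation, and compute the Shannon entropy of the resulting distribution of unique instrument combinations. The input encoding (frozensets of instrument labels) and the instrument names below are illustrative assumptions, not the paper's actual data format.

```python
import math
from collections import Counter

def shannon_entropy(combinations):
    """Shannon entropy (in bits) of the distribution of unique
    instrument combinations observed across a trial's frames.

    `combinations` is any sequence of hashable items, e.g. frozensets
    of per-frame instrument labels (a hypothetical encoding; the
    study does not specify the exact representation)."""
    counts = Counter(combinations)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical per-frame instrument sets from a single trial
frames = [
    frozenset({"suction"}),
    frozenset({"suction", "grasper"}),
    frozenset({"suction", "grasper"}),
    frozenset({"cottonoid"}),
]
print(round(shannon_entropy(frames), 3))  # 1.5
```

A per-trial entropy value computed this way could then serve as the single predictor in a logistic regression of trial success, as described in the METHODS.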
