Abstract

Background
Use of videos of surgical and medical techniques for educational purposes has grown in recent years. To our knowledge, no validated tool exists to specifically assess the quality of these types of videos. Our goal was to create an evaluation tool and study its intrarater and interrater reliability and its acceptability. We named our tool UM-OSCAARS (Université de Montréal Objective and Structured Checklist for Assessment of Audiovisual Recordings of Surgeries/techniques).

Methods
UM-OSCAARS is a grid containing 10 criteria, each graded on an ordinal Likert-type scale of 1 to 5 points. We tested the grid with the help of 4 volunteer otolaryngology – head and neck surgery specialists who individually viewed 10 preselected videos and graded each criterion for each video. To evaluate intrarater reliability, the evaluation took place in 2 phases separated by 4 weeks. Interrater reliability was assessed by comparing the 4 top-ranked videos of each evaluator.

Results
There was almost-perfect agreement among the evaluators on the 4 videos that received the highest scores, demonstrating that the tool has excellent interrater reliability. There was excellent test–retest correlation, demonstrating the tool’s intrarater reliability.

Conclusion
The UM-OSCAARS has proven reliable and acceptable to use, but its validity needs to be more thoroughly assessed. We hope this tool will lead to an improvement in the quality of technical videos used for educational purposes.

