Abstract
The Direct Observation of Procedural Skills (DOPS) form is used as a workplace-based assessment tool in the current Australian and New Zealand College of Anaesthetists curriculum. The objective of this study was to evaluate the reliability of DOPS when used to score trainees performing ultrasound-guided regional anaesthesia. The reliability of an assessment tool is defined as the reproducibility of scores given by different assessors viewing the same trainee. Forty-nine anaesthetists were recruited to score two scripted videos of trainees performing a popliteal sciatic nerve block and an axillary brachial plexus block. Reliability, as measured by intraclass correlation coefficients, ranged from -0.01 to 0.43 for the individual DOPS items and was 0.15 for the 'Overall Performance for this Procedure' item. Assessors scored consistently within DOPS: the sum of the individual item scores correlated significantly with the 'Overall Performance for this Procedure' item (r=0.78 to 0.80, P<0.001) and with yes versus no responses to the 'Was the procedure completed satisfactorily?' item (W=24, P=0.0004 for Video 1; W=65, P=0.003 for Video 2). While DOPS demonstrated a good degree of internal consistency in this setting, inter-rater reliability did not reach the levels generally recommended for formative assessment tools. The form's feasibility could be improved by removing the 'Was the procedure completed satisfactorily?' item without loss of information.
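As a rough illustration of the statistics reported above, the sketch below computes an intraclass correlation coefficient and the internal-consistency checks from a hypothetical assessor-by-item score matrix. The data, the matrix shapes, the choice of a two-way ICC(2,1) model, and the use of a rank-sum test for the W statistic are all assumptions made for illustration; they are not the study's actual dataset or analysis plan.

```python
import numpy as np
from scipy import stats

def icc2_1(scores: np.ndarray) -> float:
    """Two-way random-effects, absolute-agreement, single-rater ICC(2,1).

    `scores` is an (n_subjects, k_raters) matrix of ratings.
    Assumed model for illustration; the paper does not specify its ICC form.
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-rater means

    # Mean squares from the two-way ANOVA decomposition.
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((scores - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                        # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical data: 49 assessors each rating 8 DOPS items on one video.
rng = np.random.default_rng(0)
item_scores = rng.integers(1, 10, size=(49, 8))
overall = item_scores.mean(axis=1).round() + rng.integers(-1, 2, size=49)

# Internal consistency: correlate the sum of item scores with the
# 'Overall Performance for this Procedure' item across assessors.
r, p = stats.pearsonr(item_scores.sum(axis=1), overall)

# Compare summed scores between 'yes' and 'no' responders to the
# satisfactory-completion item. A Wilcoxon/Mann-Whitney-type rank-sum
# comparison is assumed here, consistent with the reported W statistic.
satisfactory = rng.integers(0, 2, size=49).astype(bool)
totals = item_scores.sum(axis=1)
w, p_rs = stats.mannwhitneyu(totals[satisfactory], totals[~satisfactory])
```

In this layout, inter-rater reliability for a single item would be obtained by calling `icc2_1` on a (videos x assessors) matrix of that item's scores; with only two videos as subjects the estimate is unstable, which is one reason such studies report a range of ICC values per item.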