Background
Upon entering the healthcare system, junior doctors may lack the skills required to care for patients and may feel unprepared for their role, with considerable variation in their proficiency in performing particular clinical procedures.

Objective
To assess interns' performance of nine basic clinical procedures and to compare self-reported and observed proficiency.

Methods
Seventeen interns were observed performing nine clinical procedures in a simulated setting in June 2021 (Assessment 1) and January 2022 (Assessment 2). Observers recorded whether each step of each procedure was performed correctly and provided an overall rating of proficiency. The participants also rated their own level of proficiency.

Results
At Assessment 1, the mean percentage of steps performed correctly ranged from 41.9% to 83.5% across procedures; at Assessment 2 it ranged from 41.9% to 97.8%. The most common median proficiency rating was 'close supervision' at Assessment 1 and 'indirect supervision' at Assessment 2. Performance improved significantly from Assessment 1 to Assessment 2, with a large effect size. Correlations between observer-rated and self-reported proficiency were low.

Conclusions
The large improvement in performance across the two assessments is encouraging. However, there is a need to address the variability in performance at graduation from medical school, and to ensure that any assessment of proficiency does not rely on self-report alone.