Abstract

Background: There is strong evidence supporting the use of the Lasater Clinical Judgment Rubric (LCJR) to score learners' clinical judgment during in-person simulation performance and clinical experience reflections. However, a gap exists in using the LCJR to evaluate clinical judgment after observing asynchronous simulation.

Objective: We aimed to determine the reliability, feasibility, and usability of the LCJR for scoring learners' written reflections after observing expert-modeled asynchronous simulation videos.

Design/setting/participants: We used a one-group, descriptive design and sampled pre-licensure, junior-level bachelor's learners from the Southwestern United States.

Methods: Participants observed eight expert-modeled asynchronous simulation videos over one semester and provided written responses to clinical judgment prompts. We scored clinical judgment using the LCJR. We studied reliability by measuring the internal consistency of 11 clinical judgment prompts and interrater reliability with two raters. This study also investigated the feasibility and usability of the asynchronous simulation learning activity using descriptive statistics. Feasibility included the time learners spent completing written responses and the time raters spent evaluating written responses. Learners reported usability perceptions using an instructor-developed survey.

Results: Sixty-three learners completed 504 written responses to clinical judgment prompts. Cohen's kappa ranged from 0.34 to 0.86, with a cumulative κ = 0.58. Gwet's AC ranged from 0.48 to 0.90, with a cumulative AC = 0.74. Cronbach's alpha ranged from 0.51 to 0.72. Learners spent on average 28.32 ± 12.99 min per expert-modeling video observation. Raters spent on average 4.85 ± 1.34 min evaluating written responses for each participant. Learners reported that the asynchronous learning activity was usable.

Conclusions: Nurse educators can reliably use the LCJR to score learners' clinical judgment after observing asynchronous expert-modeled simulation. Logistically, learners can complete the reflective learning activity, and faculty can use the LCJR to measure clinical judgment, in a feasible amount of time. Further, participants perceived the asynchronous learning activity as usable. Nurse educators should utilize this learning activity to evaluate and track observer clinical judgment development.
