Abstract

In Germany, progress assessments in head and neck ultrasonography training have so far been carried out mainly theoretically and lack standardisation, making quality assurance and comparisons between certified courses from different providers difficult. This study aimed to develop and integrate a direct observation of procedural skills (DOPS) assessment into head and neck ultrasound education and to explore the perceptions of both participants and examiners. Five DOPS tests oriented towards assessing basic skills were developed for certified head and neck ultrasound courses based on national standards. The DOPS tests were completed by 76 participants from basic and advanced ultrasound courses (n = 168 documented DOPS tests) and evaluated on a 7-point Likert scale. Ten examiners administered and evaluated the DOPS after detailed training. The variables "general aspects" (6.0 scale points (SP) vs. 5.9 SP; p = 0.71), "test atmosphere" (6.3 SP vs. 6.4 SP; p = 0.92), and "test task setting" (6.2 SP vs. 5.9 SP; p = 0.12) were rated positively by both participants and examiners. There was no significant difference between the basic and advanced courses in the overall DOPS test results (p = 0.81). Regardless of the course, there were significant differences in the total number of points achieved between individual DOPS tests. DOPS tests are accepted by participants and examiners as an assessment tool in head and neck ultrasound education. In view of the trend toward "competence-based" teaching, this type of test format should be applied and validated in the future.
