Abstract
Otoscopy is a key clinical examination used by multiple healthcare providers, but training and testing of otoscopy skills remain largely uninvestigated. Simulator-based assessment of otoscopy skills exists, but evidence on its validity is scarce. In this study, we explored automated assessment and performance metrics of an otoscopy simulator through collection of validity evidence according to Messick's framework. Novices and experienced otoscopists completed a test program on the Earsi otoscopy simulator. Automated assessment of diagnostic ability and performance was compared with manual ratings of technical skills. Reliability of the assessment was evaluated using Generalizability theory. Linear mixed models and correlation analysis were used to compare automated and manual assessments. Finally, we used the contrasting groups method to define a pass/fail level for the automated score. A total of 12 novices and 12 experienced otoscopists completed the study. We found an overall G-coefficient of .69 for the automated assessment. The experienced otoscopists achieved a significantly higher mean automated score than the novices (59.9% (95% CI [57.3%-62.6%]) vs. 44.6% (95% CI [41.9%-47.2%]), P < .001). For the manual assessment of technical skills, there was no significant difference between the groups, nor did the automated score correlate with the manually rated score (Pearson's r = .20, P = .601). We established a pass/fail standard of 49.3% for the simulator's automated score. In summary, we explored validity evidence supporting an otoscopy simulator's automated score and found that this score mainly reflects cognitive skills. Manual rating of technical skills therefore still seems necessary, and external video recording is required for valid assessment. To improve reliability, the test course should include more cases to achieve a higher G-coefficient, and a higher pass/fail standard should be used.
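For readers unfamiliar with the contrasting groups method mentioned above, the sketch below illustrates how such a pass/fail cut-off is typically derived: normal distributions are fitted to the scores of the two groups, and the standard is set at the point where the two distributions intersect. The score values in this example are hypothetical and are not the study data.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

# Hypothetical automated scores (%) for two contrasting groups;
# illustrative values only, not taken from the study.
novice_scores = np.array([41, 43, 45, 44, 47, 42, 46, 44, 45, 43, 48, 46])
expert_scores = np.array([58, 61, 59, 62, 57, 60, 63, 58, 61, 59, 60, 62])

# Fit a normal distribution to each group's scores.
mu_n, sd_n = novice_scores.mean(), novice_scores.std(ddof=1)
mu_e, sd_e = expert_scores.mean(), expert_scores.std(ddof=1)

# The pass/fail standard is the point between the two group means
# where the fitted densities are equal (the distributions intersect).
def density_difference(x):
    return norm.pdf(x, mu_n, sd_n) - norm.pdf(x, mu_e, sd_e)

cutoff = brentq(density_difference, mu_n, mu_e)
print(f"Pass/fail standard: {cutoff:.1f}%")
```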