Immersive virtual reality versus in-person performance assessment in undergraduate medical education – an exploratory comparison

Abstract

Virtual reality (VR) has emerged as a promising tool in medical education and may enhance learning. VR could also be used in practical examinations such as Objective Structured Clinical Examinations (OSCEs); however, few studies have examined its use in assessment. This study compared a VR OSCE station with a corresponding in-person OSCE station for fifth-year medical students with respect to workload, fairness, realism, and student performance. It also examined VR-related side effects, usability, technology acceptance, and the technical feasibility of the VR station. An emergency medicine station was adapted for the mandatory OSCE and replicated in VR. Fifth-year medical students who encountered this station during the OSCE could participate in the OSCE group, while the others could opt into the VR group. Performance was assessed using an identical checklist; all other metrics were assessed using questionnaires, and any technical issues were documented separately. The VR station was rated on par with the in-person OSCE station for workload, fairness, and realism, although students' performance scores were lower in the VR OSCE. The VR OSCE received positive feedback on usability and technology acceptance, with minimal side effects. Most technical challenges during the VR station could be resolved, and all students completed the VR OSCE station. Our findings suggest that assessing students in VR is technically feasible, but its limitations and applicability must be considered within the broader context of performance assessment.

Similar Papers
  • Research Article
  • Cited by 5
  • 10.2196/55066
Comparing Virtual Reality-Based and Traditional Physical Objective Structured Clinical Examination (OSCE) Stations for Clinical Competency Assessments: Randomized Controlled Trial.
  • Jan 10, 2025
  • Journal of medical Internet research
  • Tobias Mühling + 4 more

Objective structured clinical examinations (OSCEs) are a widely recognized and accepted method to assess clinical competencies but are often resource-intensive. This study aimed to evaluate the feasibility and effectiveness of a virtual reality (VR)-based station (VRS) compared with a traditional physical station (PHS) in an already established curricular OSCE. Fifth-year medical students participated in an OSCE consisting of 10 stations. One of the stations, emergency medicine, was offered in 2 modalities: VRS and PHS. Students were randomly assigned to 1 of the 2 modalities. We used 2 distinct scenarios to prevent content leakage among participants. Student performance and item characteristics were analyzed, comparing the VRS with PHS as well as with 5 other case-based stations. Student perceptions of the VRS were collected through a quantitative and qualitative postexamination online survey, which included a 5-point Likert scale ranging from 1 (minimum) to 5 (maximum), to evaluate the acceptance and usability of the VR system. Organizational and technical feasibility as well as cost-effectiveness were also evaluated. Following randomization and exclusions of invalid data sets, 57 and 66 participants were assessed for the VRS and PHS, respectively. The feasibility evaluation demonstrated smooth implementation of both VR scenarios (septic and anaphylactic shock) with 93% (53/57) of students using the VR technology without issues. The difficulty levels of the VRS scenarios (septic shock: P=.67; anaphylactic shock: P=.58) were comparable to the average difficulty of all stations (P=.68) and fell within the reference range (0.4-0.8). In contrast, VRS demonstrated above-average values for item discrimination (septic shock: r'=0.40; anaphylactic shock: r'=0.33; overall r'=0.30; with values >0.3 considered good) and discrimination index (septic shock: D=0.25; anaphylactic shock: D=0.26; overall D=0.16, with 0.2-0.3 considered mediocre and <0.2 considered poor). 
Apart from some hesitancy toward its broader application in future practical assessments (mean 3.07, SD 1.37 for VRS vs mean 3.65, SD 1.18 for PHS; P=.03), there were no other differences in perceptions between VRS and PHS. Thematic analysis highlighted the realistic portrayal of medical emergencies and fair assessment conditions provided by the VRS. Regarding cost-effectiveness, initial development of the VRS can be offset by long-term savings in recurring expenses like standardized patients and consumables. Integration of the VRS into the current OSCE framework proved feasible both technically and organizationally, even within the strict constraints of short examination phases and schedules. The VRS was accepted and positively received by students across various levels of technological proficiency, including those with no prior VR experience. Notably, the VRS demonstrated comparable or even superior item characteristics, particularly in terms of discrimination power. Although challenges remain, such as technical reliability and some acceptance concerns, VR remains promising in applications of clinical competence assessment.

  • Research Article
  • Cited by 17
  • 10.1080/10401334.2017.1279057
The Associations Between Clerkship Objective Structured Clinical Examination (OSCE) Grades and Subsequent Performance
  • Mar 2, 2017
  • Teaching and Learning in Medicine
  • Ting Dong + 6 more

Construct: We investigated the extent of the associations between medical students' clinical competency, as measured by performance in Objective Structured Clinical Examinations (OSCEs) during Obstetrics/Gynecology and Family Medicine clerkships, and later performance in both undergraduate and graduate medical education. Background: There is a relative dearth of studies on the correlations between undergraduate OSCE scores and future exam performance within either undergraduate or graduate medical education, and almost none linking these simulated encounters to eventual patient care. The studies that do correlate clerkship OSCE scores with future performance often have small sample sizes and/or include only 1 clerkship. Approach: Students in the USU graduating classes of 2007 through 2011 participated in the study. We investigated correlations of clerkship OSCE grades with United States Medical Licensing Examination Step 2 Clinical Knowledge, Step 2 Clinical Skills, and Step 3 scores, as well as with Postgraduate Year 1 program director evaluation scores on Medical Expertise and Professionalism. We also conducted contingency table analyses to examine the associations between poor performance on clerkship OSCEs and failing Step 3 or receiving poor program director ratings. Results: The correlation coefficients between the clerkship OSCE grades and the outcomes were weak. The strongest correlations were with the Step 2 Clinical Skills Integrated Clinical Encounter component score, Step 2 Clinical Skills, and Step 3 scores. Contingency table associations between poor performance on both clerkship OSCEs and poor Postgraduate Year 1 program director ratings were significant. Conclusions: These results provide additional, but limited, validity evidence for the use of OSCEs during clinical clerkships, given their associations with subsequent performance measures.

  • Research Article
  • Cited by 13
  • 10.5144/0256-4947.2008.192
Objective structured clinical examinations as an assessment method in residency training: practical considerations.
  • May 1, 2008
  • Annals of Saudi Medicine
  • Mohammed Hijazi + 1 more

  • Research Article
  • Cited by 11
  • 10.1046/j.1365-2923.2003.01564.x
Evaluating the outcomes of undergraduate medical education
  • Jun 27, 2003
  • Medical Education
  • Diana F Wood

  • Research Article
  • 10.3390/ime4030025
Reliability and Sources of Variation of Preclinical OSCEs at a Large US Osteopathic Medical School
  • Jul 5, 2025
  • International Medical Education
  • Martin Schmidt + 2 more

The objective structured clinical examination (OSCE) is a well-established tool for assessing clinical skills, providing reliability, validity, and generalizability for high-stakes examinations. Des Moines University College of Osteopathic Medicine (DMU-COM) adapted the OSCE for formative assessments in undergraduate medical education, focusing on interpersonal aspects in the primary care setting. Students are graded by standardized patients and faculty observers on interpersonal skills, history/physical examination, oral case presentation, and documentation. The purpose of the study is to establish the reliability and to identify sources of variation in the DMU-COM OSCE to aid medical educators in their understanding of the accuracy of clinical skills. We examined student performance data across five OSCE domains. We assessed intra- and inter-OSCE reliability by calculating KR20 values, determined sources of variation by multivariate regression analysis, and described relationships among observed variables through factor analysis. The results indicate that the OSCE captures student performance in three dimensions with low intra-OSCE reliability but acceptable longitudinal inter-OSCE reliability. Variance analysis shows significant measurement error in rubric-graded scores but negligible error in checklist-graded portions. Physical exam scores from patients and faculty showed no correlation, indicating value in having two different observers. We conclude that a series of formative OSCEs is a valid tool for assessing clinical skills in preclinical medical students. However, the low intra-assessment reliability cautions against using a single OSCE for summative clinical skills competency assessments.

  • Research Article
  • Cited by 1
  • 10.4300/jgme-d-19-00224
Introducing the Objective Structured Clinical Examination in Haiti.
  • Aug 1, 2019
  • Journal of Graduate Medical Education
  • Ornella Sainterant + 2 more

Haiti, a nation of approximately 10.7 million people located in the western Caribbean,1 is a low-income country, with 59% of its population living below the poverty line (less than $2.41 per day) and 24% living in conditions of extreme poverty (less than $1.23 per day).2 Haiti has a history of political instability and natural disasters, and it remains the country with the highest rate of poverty in the Americas.2 The health care system in Haiti is regulated by the Ministry of Public Health and Population (MSPP). MSPP is under-resourced, spending only US $13 per person on health care each year. This represents a mere 6.1% of the national budget and is significantly less than in the neighboring countries of Cuba (US $781) and the Dominican Republic (US $180).3 Six universities in Haiti offer degree programs in medicine. Upon graduation, medical school graduates are required to complete 1 year of social service and can then practice medicine independently or apply for entrance into one of Haiti's 37 residency programs. Zanmi Lasante (ZL), a sister organization to Boston, Massachusetts-based Partners In Health, in partnership with MSPP, offers 1 residency program at Hospital St. Nicolas (HSN) in Saint-Marc and 5 programs at Hôpital Universitaire de Mirebalais (HUM) in Mirebalais. The programs that have been accepted are in the preapproval phase with the Accreditation Council for Graduate Medical Education International (ACGME-I). As part of its system of assessment, in 2015 we introduced the objective structured clinical examination (OSCE) in the family medicine residency program at HSN, with the help of 2 Canadian fellows who volunteered in Haiti. Faculty members were trained on the new technique and were given opportunities to practice their new skills. These "pioneers" then trained other faculty at HUM.
Although the OSCE is widely used in high-income countries, ZL was the first institution in Haiti to use it. The OSCE is a group of tests comprising a succession of stations with simulated clinical problems, involving standardized patients or mannequins, that learners must solve in a limited time. Each station has clearly defined objectives and a checklist for evaluating the candidates.4 It is considered the gold standard for evaluating clinical competencies, including the physician-patient relationship, the physical examination, and interpersonal and communication skills (box).4 In 2016, the Director of Graduate Medical Education at ZL implemented an OSCE as a pre-assessment tool during the orientation month at the start of postgraduate year 1 (PGY-1). Based on this positive experience, the Graduate Medical Education Committee (GMEC) voted to implement the OSCE as part of the recruitment process for all ZL programs. In 2017, we conducted the first OSCE session during recruitment, with 90 candidates aspiring to enter 1 of the 6 ZL residency programs. Each session consisted of 4 themed stations with different objectives and a break station; each station lasted 10 minutes. The OSCE was scored by faculty, the training and research department director, and representatives from the school of medicine. The OSCE was a new evaluation tool for medical educators in Haiti, and implementation faced challenges. We needed to train faculty and the standardized patients (SPs). Because of a lack of funds, we used medical students and residents as SPs. Using medical students as SPs was helpful in multiple ways, allowing the students to learn clinical portrayal and trainee performance.
Candidates and student SPs were nervous at first, but by the end of the day, all said they had benefited from the experience. We are currently testing the OSCE as an evaluation tool in 2 residency programs (anesthesiology and pediatrics), and we intend to implement it in all ZL residency programs.

  • Research Article
  • Cited by 87
  • 10.1111/medu.12801
Revisiting 'Assessment of clinical competence using an objective structured clinical examination (OSCE)'.
  • Mar 15, 2016
  • Medical Education
  • Ronald M Harden

  • Research Article
  • Cited by 7
  • 10.1186/s12909-021-02650-7
Development and evaluation of a spiral model of assessing EBM competency using OSCEs in undergraduate medical education
  • Apr 10, 2021
  • BMC Medical Education
  • B Kumaravel + 2 more

Background: Medical students often struggle to understand the relevance of Evidence Based Medicine (EBM) to their clinical practice, yet it is a competence that all students must develop before graduation. Objective structured clinical examinations (OSCEs) are a valued tool for assessing critical components of EBM competency, particularly different levels of mastery as students progress through the course. This study developed and evaluated EBM-based OSCE stations with the aim of establishing a spiral approach to EBM OSCE stations for undergraduate medical students. Methods: OSCE stations were developed with increasingly complex EBM tasks. Stations were classified according to the classification rubric for EBP assessment tools (CREATE) framework and mapped against the recently published core competencies for evidence-based practice (EBP). Performance data were analyzed using Classical Test Theory, examining mean scores, pass rates, and station item-total correlation (ITC) in SPSS. Results: Six EBM-based OSCE stations assessing various stages of EBM were created for use in high-stakes summative OSCEs for different year groups across the undergraduate medical degree. All OSCE stations except one had excellent correlation coefficients, ranging from 0.21–0.49, and hence high reliability. Domain mean scores ranged from 13.33 to 16.83 out of 20. High reliability was demonstrated for each of the summative OSCE circuits (Cronbach's alpha = 0.67–0.85). In the CREATE framework, these stations assessed the knowledge, skills, and behaviour of medical students in asking, searching, appraising, and integrating evidence in practice. The OSCE stations were useful for assessing six core evidence-based practice competencies, which are meant to be practiced with exercises.
A spiral model of OSCEs of increasing complexity was proposed to assess EBM competency as students progressed through the MBChB course. Conclusions: Using OSCEs is a feasible method of authentically assessing learner EBM performance and behaviour in a high-stakes assessment setting. Valid and reliable EBM-based OSCE stations provide evidence for the continued development of a hierarchy for assessing scaffolded learning and mastery of EBM competency. Further work is needed to assess their predictive validity.

  • Research Article
  • 10.53106/199044282025045901006
Applying Virtual Reality to Develop an Objective Structured Clinical Examination for Augmentative and Alternative Communication Selection
  • Apr 1, 2025
  • 教育研究學報
  • 郭雅雯

Augmentative and alternative communication (AAC) can benefit students with communication needs. However, speech therapists and special education teachers often lack the confidence required to use AAC. Objective structured clinical examinations (OSCEs), regarded as highly reliable clinical competence assessment models, are rarely used in speech therapy and special education professional competence assessments. This study evaluated AAC selection competence among speech therapists and special education teachers using an OSCE, and it applied virtual reality (VR) to develop the AAC selection process for students in compulsory education into a VR-based AAC selection OSCE. Participants were 16 speech therapy students, 10 clinical speech therapists, and 17 special education teachers. After receiving training on AAC selection and completing the OSCE, the participants filled out a feedback questionnaire. Responses were analyzed using descriptive and inferential statistics, including independent-samples t tests and one-way analysis of variance. The participants provided positive quantitative and qualitative feedback, although significant differences in opinion were observed between participants of different seniority levels and professions regarding the quality, reliability and validity, and organization of the VR-based AAC selection OSCE. Overall, the VR-based AAC selection OSCE was shown to be feasible. In the future, we intend to continue applying and developing VR OSCEs for other professional clinical skills.

  • Discussion
  • Cited by 1
  • 10.1213/ane.0000000000005556
In Response.
  • Jun 15, 2021
  • Anesthesia & Analgesia
  • David O Warner + 15 more

  • Preprint Article
  • 10.2196/preprints.69428
Immersive Virtual Reality and AI (Generative Pretrained Transformer) to Enhance Student Preparedness for Objective Structured Clinical Examinations: Mixed Methods Study (Preprint)
  • Dec 16, 2024
  • Shaniff Esmail + 1 more

BACKGROUND Immersive virtual reality (VR) and artificial intelligence have been used to determine whether a simulated clinical exam setting can reduce anxiety in first-year occupational therapy students preparing for objective structured clinical examinations (OSCEs). Test anxiety is common among postsecondary students, leading to negative outcomes such as increased dropout risk, lower grades, and limited employment opportunities. Students unfamiliar with specific testing environments are particularly prone to anxiety, and VR simulations of OSCEs may allow students to become familiar with the exam setting. OBJECTIVE This study aimed to assess the efficacy of a VR simulation depicting clinical settings in reducing student anxiety about a clinical exam while gathering perspectives on students' first-year coursework experiences to better understand their learning environment. METHODS An experimental, nonrandomized controlled trial compared state anxiety, trait test anxiety, and OSCE grades in 2 groups of first-year occupational therapy students, analyzed using independent t tests (2-tailed). Group 1 (NoVR) was not exposed to the VR simulation and acted as a control group for group 2 (YesVR), who were exposed to the VR simulation. The VR used artificial intelligence in the form of a generative pretrained transformer to generate responses from virtual patients as students interacted with them in natural language. Self-reported psychometric scales measured anxiety levels 3 days before the OSCE. YesVR students completed perceived-preparation surveys at 2 time points (3 weeks and 3 days before the OSCE), analyzed using dependent t tests. Semistructured interviews and focus groups were conducted within 1 week after the OSCE. Student perspectives on their classes and VR experiences were summarized using interpretative thematic analysis.
RESULTS In total, 60 students participated in the study: 32 (53%) in the NoVR group and 28 (47%) in the YesVR group. The YesVR group showed a significant reduction in state anxiety (t58=3.96; P<.001; Cohen d=1.02), with a mean difference of 11.96 units (95% CI 5.92-18.01). Trait test anxiety and OSCE scores did not differ between groups. All perceived-preparedness variables increased in the YesVR group. In total, 42% (25/60) of the participants took part in interviews and focus groups, which yielded major themes regarding factors that affect OSCE performance, including student experience and background, feedback and support, fear of the unknown, self-consciousness, and knowledge of the exam environment. CONCLUSIONS Intolerance of uncertainty may lead students to interpret ambiguous exam situations as overly precarious. Findings suggest that the VR simulation was associated with reduced state anxiety, although results from this small, nonrandomized sample should be interpreted cautiously. Qualitative data indicated that VR helped students gain familiarity with clinical exam settings, potentially decreasing uncertainty-based anxiety. Future research with larger or randomized samples is needed to confirm these findings and to explore advanced VR tools offering feedback to enhance learning.

  • Research Article
  • 10.1097/00001888-200407001-00016
Indiana University School of Medicine.
  • Jul 1, 2004
  • Academic medicine : journal of the Association of American Medical Colleges
  • Glenda R Westmoreland + 4 more

  • Research Article
  • Cited by 20
  • 10.1016/j.jsurg.2016.08.018
Surgery Clerkship Evaluations Are Insufficient for Clinical Skills Appraisal: The Value of a Medical Student Surgical Objective Structured Clinical Examination
  • Sep 28, 2016
  • Journal of Surgical Education
  • Kathryn L Butler + 9 more

  • Discussion
  • Cited by 5
  • 10.1016/j.amjmed.2022.01.001
AAIM Recommendations to Improve Learner Transitions
  • Jan 14, 2022
  • The American Journal of Medicine
  • Kristen Lewis + 7 more

  • Research Article
  • Cited by 20
  • 10.1097/aln.0000000000000067
Objective Structured Clinical Examination and Board Certification in Anesthesiology
  • Jan 1, 2014
  • Anesthesiology
  • James P Rathmell + 2 more
