Abstract

Background: The structure and content of rheumatology training programs vary widely among European countries. Harmonization of competence assessment methods across EULAR countries could contribute to ensuring a minimal standard of care.

Objectives: To identify and review the evidence on competence assessment methods and strategies in postgraduate medical training in rheumatology and other specialties.

Methods: As part of the EULAR project to develop points to consider on the assessment of competences in rheumatology training, a systematic literature review (SLR) was performed. Two reviewers (AA and AN) independently identified eligible studies according to the PIM framework: P (population): trainees, fellows; I (instrument of interest): assessment strategies and methods; M (measurement properties of interest): validity, discrimination, feasibility. Two searches were conducted: (i) in rheumatology, retrieving original studies; (ii) in related medical specialties, retrieving SLRs through which original studies were identified. Risk of bias was assessed using the Medical Education Research Study Quality Instrument (MERSQI) and the tool by Daly et al for qualitative studies. Studies were too heterogeneous to allow for any form of pooling, so descriptive results are presented.

Results: Of the 6,276 articles from the rheumatology search, 4 met the inclusion criteria; of the 2,265 SLRs in other specialties, 36 were included, corresponding to a total of 133 original studies. Studies on the assessment of competences in rheumatology were at variable risk of bias and explored only two methods: direct observation of procedural skills (DOPS) and objective structured clinical examination (OSCE) (Table 1). Rheumatology OSCEs have been used to assess clinical and communication skills, professionalism, and practical skills in musculoskeletal ultrasound, with conflicting evidence on internal consistency, reliability and inter-rater reliability. However, OSCEs including clinical, laboratory and imaging stations performed best, with good to very good internal consistency (Cronbach's α = 0.83-0.92) and inter-rater reliability (correlation coefficient r = 0.80-0.95). A fair to moderate correlation (r = 0.44-0.52) between OSCEs and other assessment tools, including DOPS, has been found. The study on DOPS, but not those on OSCEs, provided evidence of feasibility. Studies in other specialties were more heterogeneous in the strategies/tools investigated and in the type and comprehensiveness of the analysis. The majority of studies on OSCEs assessing clinical skills showed good to very good inter-rater reliability (r = 0.60-0.95), while those on OSCEs assessing communication skills consistently demonstrated good to very good internal consistency (Cronbach's α = 0.70-0.98). Other tools, such as multisource feedback (MSF) and the mini-clinical evaluation exercise (mini-CEX), showed feasibility and good to very good internal consistency, but results on validity and reliability were conflicting.

Conclusion: Although there is a consistent body of evidence on the assessment of competences in postgraduate medical training in several specialties, data in rheumatology are scarce and this partial picture includes some conflicting evidence. OSCEs represent an appropriate tool to assess clinical competences and correlate fairly well with other assessment strategies; DOPS, MSF and mini-CEX are other feasible alternatives. A mapping of European countries and a qualitative study will additionally be performed.

Disclosure of Interests: Alessia Alunno: None declared, Aurelie Najm: None declared, Francisca Sivera: None declared, Catherine Haines: None declared, Sofia Ramiro: Grant/research support from: MSD, Consultant for: AbbVie, Lilly, MSD, Novartis, Pfizer, Sanofi, Speakers bureau: AbbVie, Lilly, MSD, Novartis, Pfizer, Sanofi
