
See Article, page 220

It has been >40 years since the term “competency-based medical education” (CBME) first appeared in a forward-thinking report prepared for the World Health Organization.1 This landmark report described a curriculum that was organized around defined clinical competencies that were acquired through a mastery learning approach. The goal of this new curriculum was to produce a better health professional who could practice proficiently across the breadth of medicine and contextualize care to meet local patient needs.1 CBME was described as an outcome-based approach, in contrast to the traditional time-based approach to medical training. The competency-focused outcomes were designed to prepare physicians with all the expected competencies, beyond medical knowledge alone, to better serve society.2

As we celebrate the centenary edition of Anesthesia & Analgesia, it should be recognized that CBME represents only a small portion of the advances in graduate medical education. The time-based training with which most anesthesiologists are familiar dates back to the Flexner report,3 released a decade before the first issue of this journal. Flexner’s survey of the state of medical education at the turn of the 20th century led to a standardization of medical school curricula. Out of that report also arose the concept of clerkships and the recognition that innovations in medical education are necessary to support advancements in medicine. CBME may represent the evolution in medical education of the 21st century.
The goal of a postgraduate CBME curriculum in anesthesiology is to explicitly outline, teach, assess, and provide feedback to residents so they can graduate with the competencies to ensure safe and effective anesthesiology practice.4 Today, the national standards of postgraduate anesthesiology training have embraced a CBME model in Canada, the United States, Australia, New Zealand, the United Kingdom, and Ireland.5–9 With this expanding global experience with CBME, have the theoretical advantages of this approach translated into practice? Moreover, are there any unintended consequences of this paradigm shift in medical education? In this article, we highlight the progress that has been made toward the integration of CBME curricula into the postgraduate training of anesthesiologists. This article outlines several of the benefits of a transition to CBME and explores its challenges for programs, faculty, and residents. We also discuss important areas for the future development of medical education, including innovations in resident assessment and considerations to ensure training programs remain relevant and well positioned to prepare anesthesiologists to meet future health care needs.

WHERE IS CBME NOW?

Several countries have concurrently developed requirements and standards for CBME anesthesiology training. A recent comparison of the competency frameworks of CBME programs in Europe, the United States, and Canada demonstrated that >90% of the clinical competencies that were identified as necessary skills were common to all 3 regions.10 Core anesthesia competencies, such as perioperative anesthesia care, management of critically ill patients in acute care settings, and airway management, demonstrated a high degree of overlap and importance across regions.
Conversely, specific competencies, such as navigating health care systems and incorporating practice-based learning, varied between countries, with particular emphasis in the United States, and were contextualized to culture and practice.10 The commonalities articulated by CBME training programs that are located in different countries provide an opportunity to explore international competency standards for anesthesiology postgraduate training. In addition, a universal consensus regarding the necessary core clinical competencies could facilitate potential reciprocity of postgraduate training between jurisdictions and portability to practice. Entrustable professional activities (EPAs) have supported CBME by organizing trainee assessments around specific and tangible clinical encounters. These clinical activities can be observed and fully entrusted to trainees to perform unsupervised after all the required knowledge, skills, and behaviors of the encounter are achieved. EPAs can enable assessment in CBME by mapping to all the desired competencies.11 Furthermore, they can be scaffolded for demonstrable progression of increased resident autonomy and responsibility in patient care on the path to independent practice.12 The Royal College of Physicians and Surgeons of Canada Competence by Design framework for anesthesiology training was implemented nationally in 2017. Currently, the curriculum includes 49 EPAs that are distributed through the 4 stages of residency training (Table 1).13 Most of these EPAs are designed as observable clinical encounters. The Accreditation Council for Graduate Medical Education (ACGME) Milestones Project described the anesthesiology competencies to provide a nationally shared framework. However, the project did not describe the methods to assess milestones in practice, nor how to determine whether the milestones have been achieved. 
More recently, 20 EPAs that map to the ACGME anesthesiology milestones have been rigorously developed to facilitate resident assessment by anesthesiology training programs in the United States.11

Table 1. Four Stages of Postgraduate Training in Competence by Design and Associated Number of Anesthesiology EPAs, Mapped to Traditional Training

Competence by Design stage     Mapping to traditional PGY in training     Number of anesthesiology EPAs
Transition to discipline       PGY 1                                      3 EPAs
Foundations of discipline      PGY 1–2                                    16 EPAs
Core of discipline             PGY 3–5                                    25 EPAs
Transition to practice         PGY 5                                      5 EPAs

Abbreviations: EPA, entrustable professional activity; PGY, postgraduate year.
Adapted from the Royal College of Physicians and Surgeons of Canada.13

One critical enabler of CBME is the workplace-based assessment (WBA). By observing residents in the complex and authentic environment of clinical practice, WBA tools serve as an important means to monitor residents’ training and document their achievement of competencies. WBAs have incorporated entrustment scales, which focus an assessor’s judgment on progression toward independent practice by asking whether a resident is capable of completing a task at a defined level of supervision. By anchoring a faculty member’s observation of a resident’s performance to a required level of supervision, reliability between assessors is improved.14 Several entrustment-based WBA scales with robust validity data for anesthesia training have been developed to assess the perioperative care competencies of residents and monitor their progress during training.15–17 Ideally, CBME is delivered with numerous WBAs, including detailed and constructive narratives.
These formative assessments (assessments for learning) serve to provide frequent targeted feedback and tailored coaching to residents, which guide their development of competencies and support mastery learning.18 Individual assessments, with quantitative performance ratings and qualitative narratives, are also collected over time for summative purposes (assessments of learning).19 Competence committees require sufficient aggregated data to make summative judgments about the residents in their program, such as decisions on promotion through the training program and documentation of progress toward readiness for independent practice. Early reports suggest that CBME trainees are receiving more constructive feedback and that CBME assessment systems are also providing training programs with more information to make summative decisions about their residents.20

WHAT APPEARS TO BE WORKING WITH CBME?

In these early days of CBME adoption, there is limited information available regarding the experiences and perceptions of residents and faculty supervisors. A survey from 2016 indicated that anesthesiology residents in the United States were satisfied with their CBME training programs and were confident that they would master the ACGME competencies.21 CBME residents from several Canadian programs, including anesthesia, identified that they valued frequent meaningful feedback from supervisors through formative assessments, as this information helped them to modify their behaviors.22 Interestingly, anesthesiology residents prefer specific supervisor feedback comments over a rating of their performance on an entrustment scale. They explained that the qualitative feedback shifted their orientation from a focus on achievement to one of self-improvement.23 This result is encouraging, as it suggests that residents are embracing a growth-oriented mindset,18 which is a driving goal of CBME.
One year after the implementation of CBME, a report from a single Canadian anesthesiology program suggested that faculty had a favorable view, perceiving that the curriculum improved the performance of junior residents and clarified expectations.24 Faculty also appreciated the resident-directed and resident-centered learning process, as they noticed residents seemed motivated, “taking their learning in their own hands and presenting it to staff.”24

WHAT AREAS IN CBME NEED MORE WORK?

A competency-based assessment of a trainee is complex and requires multiple approaches to ensure a valid reflection of their performance. The exponential increase in the number of assessments required to support CBME has created challenges for residents, faculty, and programs. The duality in the purpose of assessments (formative versus summative) can foster tensions, as stakeholders may view the purposes and stakes of the “point in time” WBA observations differently. Residents can perceive any, or all, formative assessments that provide input into competence committee decisions as summative, and therefore high stakes.23 This viewpoint can make them reluctant to engage with the assessment results as a learning opportunity. One unintended consequence of this perception is that trainees might seek out lenient assessors or seek to perform simpler cases to demonstrate their competencies. Such an approach can interfere with the intended effect of supporting a resident’s learning and growth.25 Furthermore, EPAs may not be perceived as low stakes when they result in low ratings or when a minimum number of assessments is required to ensure advancement through the training program.
Anesthesiology residents in Canada have expressed concerns that EPAs and the associated entrustment scales contribute to a “tick box exercise,” in which they are simply attempting to acquire assessments.23 In addition, anesthesiology residents reported that the quantitative entrustment scales failed to provide them with meaningful information about how they were progressing during training.23 Regrettably, some residents find it onerous to obtain feedback or an explanation for performance ratings.22 Finally, the burden of WBAs has been described as increasing workloads, disrupting clinical workflow, and fostering assessment fatigue in learners and their faculty.22,23,26,27 All of these challenges threaten the intended growth mindset of CBME. Faculty, who likely trained in programs that used a time-based paradigm, have expressed concerns about the lack of evidence to support the shift to CBME.24 In surveys, faculty have acknowledged an apprehension about the considerable amount of resources that are necessary to successfully implement and manage CBME.24 Furthermore, they recognized a mismatch between faculty and residents in their perceptions of the goals and responsibilities in a CBME model.24 CBME represents a culture change for both faculty and learners, with both groups benefiting from a growth-oriented mindset and a partnership approach to assessments.28 Faculty development is critical to help supervisors appreciate that the assessment and coaching that concomitantly occur during WBAs are not discrete and independent events. Rather, they should be seen as overlapping and interdependent processes to support resident growth. As mentioned above, entrustment scales are anchored to the required level of supervision.
They aim to reduce assessor bias and may require less faculty training than a subjective scale that asks raters to make assessments against “meeting expectations” for a postgraduate year.14 Nevertheless, a recent study suggests that faculty continue to align their assessments to preexisting cognitive benchmarks, such as a resident’s year of training, rather than the observed clinical performance.29 Even when assessments are viewed as having no consequence for a resident’s progression in training, anesthesia faculty are often unwilling to identify competency deficits, even when patient safety has been demonstrably compromised.30 Moreover, >25% of anesthesiology residents received the same rating across all the ACGME milestones, a phenomenon described as “straightlining,” compared to <10% of residents in other specialties, whose programs thereby achieved better discrimination between their trainees.31 Analysis of longitudinal anesthesiology milestones data in 5 US programs demonstrated that the frequency of straightlining varied significantly by program, from 9% to 57%.32 While this could represent a true lack of variation in resident competence, straightlining conflicts with the CBME concept that trainees progress at different rates across various competencies. Rather, straightlining may suggest that individualized resident assessments are not occurring or that programs lack a reliable method for assessing specific milestones. It could also indicate that Clinical Competency Committees are influenced by the halo effect and are applying an overall impression, or that milestone levels are still being assessed based on the year of training. Programs should be examining their assessment and competence rating culture and overall rating patterns, searching for root causes to identify contextualized local solutions. Rater errors and biases can have serious implications for the validity and defensibility of the eventual high-stakes summative judgments made using these assessments.
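The rating-pattern audit described above can be illustrated with a minimal sketch. The example below is hypothetical (the resident identifiers, rating lists, and function name are invented for illustration); it simply computes the fraction of residents whose milestone ratings are all identical, the "straightlining" rate a program might track over time.

```python
def straightline_rate(ratings_by_resident):
    """Fraction of residents whose milestone ratings are all identical.

    ratings_by_resident maps a resident identifier to that resident's
    list of milestone ratings, e.g. {"res_a": [3, 3, 3, 3]}.
    """
    if not ratings_by_resident:
        return 0.0
    straightlined = sum(
        1 for ratings in ratings_by_resident.values()
        if len(set(ratings)) == 1  # every milestone received the same rating
    )
    return straightlined / len(ratings_by_resident)

# Hypothetical program data: 2 of 3 residents straightlined.
demo = {"res_a": [3, 3, 3, 3], "res_b": [2, 3, 4, 3], "res_c": [4, 4, 4, 4]}
print(round(straightline_rate(demo), 2))  # 0.67
```

A program could compute this per faculty rater or per review cycle; a rate near the reported 57% extreme would prompt the kind of root-cause review suggested above.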
Thus, training programs and competence committees should attempt to identify their outlier faculty with leniency or stringency biases. CBME resident assessment data can also be reconsidered as a measure of faculty performance. Auditing the quality and quantity of the assessments provided to residents can provide context and feedback for faculty.12,20 Training of faculty assessors may be a way to reduce rater bias and encourage more meaningful learner assessments. After a short educational intervention for academic anesthesiologists, the quality of the feedback they provided to a simulated anesthesiology resident significantly improved, and professionalism lapses were better addressed.33 Disappointingly, studies in nonanesthesia programs have reported an inconsistent impact of rater training and have been unable to establish best practices for the optimal format or duration of an intervention.34 The shift to CBME was partly driven by a movement for public accountability in medicine. Accordingly, the high-stakes decision of confirming an anesthesiology resident’s readiness for independent clinical practice requires robust evidence.19 To allow competence committees to make summative decisions, a resident’s education portfolio should include workplace assessments completed by a variety of faculty across a wide breadth of clinical contexts.12 Unfortunately, programs are struggling to use the assessment data they receive for these purposes.20 In the absence of platforms to manage and interpret the reams of data, programs can be challenged to support the growth of their residents and to make confident, valid decisions on readiness for practice. A central goal of competency-based, time-variable medical education is a graded increase in clinical and professional responsibility, achieved by entrusting trainees to provide patient care without supervision.
Regrettably, the realization of a fully competency-based, time-variable system is constrained by current licensing and certification requirements and by program accreditation systems that rely on fixed durations of training. Consequently, we still primarily rely on high-stakes, standardized, summative examinations for anesthesiology certification. Modernization of this approach could allow evidence of a resident’s achievement of certain competencies to be used to grant a graduated license or microcertification that permits them to take on greater clinical autonomy.35,36 Logistically, this would require close collaboration and innovation between programs and various regulatory bodies.

WHAT WORK NEEDS TO BE DONE FOR CBME?

Entrustment decisions are of particular importance in anesthesiology, in which patient care often occurs in high-stakes environments. In these dynamic, complex, acute, and rapidly changing situations, WBAs may not always be possible or appropriate. Thus, there are advantages to increasing the diversity of CBME assessment modalities.
Simulation can be used to provide additional assessment data on technical and nontechnical skills.37 Simulation can also be beneficial when certain clinical events are critical but rare, and when it is not appropriate or safe to delegate responsibilities to a trainee solely for the purpose of undertaking an assessment.38 In anesthesiology and emergency medicine training, in which patient care is acute and dynamic, simulation-based milestone assessments have been shown to be valid measures of resident competency and to correlate with clinical evaluations.39–41 Relatedly, performance on a regular objective structured clinical examination (OSCE)-based milestone assessment can also provide evidence of longitudinal growth in competence in anesthesiology residents.42 In Canada, the national CBME curriculum includes 5 mandatory, standardized mannequin-based simulation scenarios.43 Although the impact of simulated performance outcomes on certification decisions has yet to be determined, every Canadian anesthesiology resident must undertake these simulation scenarios before completing their training. Similarly, artificial intelligence is now being explored to enhance resident assessment through testing knowledge and observing workplace performance. For example, by training machines to recognize speech patterns, such as time spent talking or frequency of interruptions, a resident could receive specific feedback about their interpersonal communication directly after seeing a patient.44 Machine learning is also being explored to apply learning analytics to aggregated CBME assessment data, analyzing and predicting a trainee’s EPA progress to support residents and programs.45 As our experience with CBME deepens, more work is needed to demonstrate that the shift to CBME improves trainee performance and patient outcomes.
Since operating room metrics are often used to assess the quality of patient care, one anesthesiology program in the United States examined perioperative databases to assess the efficiency of resident performance. This analysis demonstrated that patient emergence time decreased by 28 seconds for each year of training.46 While this difference was statistically significant but likely not clinically significant, it serves as an example of exploring patient outcomes as a measure of competency. Patients are also increasingly involved in competency-based resident assessments, most often in the form of multisource feedback.47 Compared to physicians, who focus on medical expertise, patients provide more comments on resident professional behaviors and communication skills in their assessments.47 Given the required investment by all stakeholders for successful CBME implementation, meaningful patient-focused outcomes still need to be identified to ensure a demonstrable return on that investment.

In view of the exponential growth in medical knowledge and evolving patient needs, CBME must be agile and responsive. Nearly a decade after the original Milestones Project, the ACGME framework was recently revised to resolve some mismatch between the competencies and the expectations for current and future graduates. The Anesthesiology Milestones now include previously absent critical skills and also reflect advances in clinical practice, such as point-of-care ultrasound (POCUS) and electronic medical records (Table 2).48

Table 2. Six Domains of Competency in Anesthesiology Milestones 2.0, and Associated Subcompetencies

Patient care
PC 1: Preanesthetic evaluation
PC 2: Perioperative care and management
PC 3: Application and interpretation of monitors
PC 4: Intraoperative care
PC 5: Airway management
PC 6: Point-of-care ultrasound
PC 7: Situational awareness and crisis management
PC 8: Postoperative care
PC 9: Critical care
PC 10: Regional (peripheral and neuraxial) anesthesia

Medical knowledge
MK 1: Foundational knowledge
MK 2: Clinical reasoning

Systems-based practice
SBP 1: Patient safety and quality improvement
SBP 2: System navigation for patient-centered care
SBP 3: Physician role in health care systems

Practice-based learning and improvement
PBLI 1: Evidence-based and informed practice
PBLI 2: Reflective practice and commitment to personal growth

Professionalism
PROF 1: Professional behavior and ethical principles
PROF 2: Accountability/conscientiousness
PROF 3: Well-being

Interpersonal and communication skills
ICS 1: Patient- and family-centered communication
ICS 2: Interprofessional and team communication
ICS 3: Communication within health care systems

Abbreviations: ICS, interpersonal and communication skills; MK, medical knowledge; PBLI, practice-based learning and improvement; PC, patient care; PROF, professionalism; SBP, systems-based practice.
Adapted from the Accreditation Council for Graduate Medical Education.9

More nuanced, but perhaps more important to all specialties, is the increasing evidence of racial disparities across the spectrum of health care. Black, Indigenous, and people of color (BIPOC) encounter barriers to accessing surgical procedures, while also having higher rates of maternal mortality and worse surgical outcomes.49,50 As we continue to examine and confront the legacy and persistence of structural racism in health care, it is clear that medical education institutions have an important role to play.
Programs must be invested in training anesthesiologists with the competencies to provide effective and equitable care to BIPOC and other marginalized populations. It behooves us to iteratively revise the CBME curriculum to reflect advances in practice (eg, POCUS). Analogous curriculum updates are also required to address social determinants of health (eg, health inequities in perioperative and maternal care for racialized persons) and to support antiracism in medical education.51,52

Finally, no discussion of CBME is complete without considering the abandonment of time as the framework of postgraduate training. If nothing else, the disruption caused by the global pandemic has served to clarify the need to innovate and to seize the opportunity CBME offers for flexibility in training time and individualized, competence-based curricula. Programs, accrediting bodies, and licensing bodies may need to embrace the time flexibility offered by CBME. This would allow them to add competent anesthesiologists to the workforce when they are ready for independent practice, as opposed to at a time-based certification date.53

CONCLUSIONS

Twenty years ago, CBME promised to dramatically improve medical education by addressing unsafe, inefficient, and poor-quality health care, all while reforming training systems to ensure that physicians have the skills needed for complex modern practice. The medical education system has since undergone an accelerated transformation. No one will dispute that implementing CBME in anesthesiology programs has proven challenging. Nevertheless, it has provided an explicit and transparent educational framework for residents, faculty, and programs. More work is needed to iteratively build on our early successes and to determine whether CBME produces more competent anesthesiologists to benefit patients and society.

DISCLOSURES

Name: Alayne Kealey, MD, FRCPC.
Contribution: This author helped with the literature search and the design, drafting, and revision of the manuscript.
Conflicts of Interest: None.
Name: Viren N. Naik, MD, MEd, MBA, FRCPC.
Contribution: This author helped with the design and revision of the manuscript.
Conflicts of Interest: V. N. Naik is an employee of the Royal College of Physicians and Surgeons of Canada.
This manuscript was handled by: Thomas R. Vetter, MD, MPH.
