The Overarching Themes From the CLER National Report of Findings 2018.
- Research Article
- 10.4300/1949-8349.10.4s.49
- Aug 1, 2018
- Journal of Graduate Medical Education
This section includes detailed findings from the second set of visits (2015–2017) of the Clinical Learning Environment Review (CLER) Program. The findings in the 6 CLER Focus Areas1 are based on site visits to the major participating clinical sites (ie, hospitals and medical centers) for 287 Accreditation Council for Graduate Medical Education (ACGME)-accredited Sponsoring Institutions (SIs) with 3 or more core residency programs.2,3 These clinical sites serve as clinical learning environments (CLEs) for the SIs.

Collectively, the 287 SIs oversee 9167 ACGME-accredited residency and fellowship programs, with a median of 20 programs per SI. These larger SIs account for 87.1% of all residents and fellows in ACGME-accredited programs, with a range of 17 to 2156 trainees per SI (median = 246).

Approximately 28% of the CLEs were located in the Northeast region of the United States, 30.3% in the South, 26.5% in the Midwest, and 14.6% in the West. The sites ranged in size from 107 to 2654 acute care beds (median = 528). The majority (67.2%) were nongovernment, not-for-profit organizations; 23.3% were government, nonfederal; 5.9% were investor-owned, for-profit; and 3.5% were government, federal. Although the CLER teams spent the majority of their time in inpatient settings, they also sometimes visited affiliated ambulatory care practices in close proximity.

In total, the CLER teams interviewed more than 1600 members of executive leadership (including chief executive officers), 9262 residents and fellows, 8164 core faculty members, and 6034 program directors of ACGME-accredited programs in group meetings.
Additionally, the CLER teams interviewed the CLEs' leadership in patient safety and health care quality, as well as thousands of residents, fellows, faculty members, nurses, pharmacists, social workers, and other health care professionals, while on walking rounds in the clinical areas.

As previously described in the CLER National Report of Findings 2016,4 these findings are based on a mixed methods approach to data gathering and analysis, which improves the accuracy of the findings by combining quantitative, descriptive, and qualitative evidence in a complementary manner. As such, some of the findings are represented quantitatively while others are described qualitatively.

The combination of methodologies and varied representation of findings should be considered when interpreting the results, making comparisons, or drawing conclusions. Both supporting and conflicting evidence may be presented to explain or qualify findings. For example, results from the group interviews may appear more positive than information gathered on walking rounds. Alternatively, practices reported during group interviews may have been verified on walking rounds.

During the group interviews with residents and fellows, faculty members, and program directors, an electronic audience response system (ARS; Keypoint Interactive version 2.6.6, Innovision Inc, Commerce Township, MI) was used to collect anonymous responses to closed-ended questions. The results from the ARS were analyzed at both the individual (eg, residents and fellows) and the CLE levels. At the individual level of analysis, results are presented as percentages of the total number of individuals surveyed. At the CLE level of analysis, individual responses were aggregated at the CLE level, and results are presented as median and interquartile range (IQR) percentages.
Statistically significant differences (ie, P ≤ .05) in responses due to resident and fellow characteristics (eg, residency year) and CLE characteristics (eg, bed size) are also reported. Of note, statistical significance does not always imply practical significance. For example, differences in responses by residency year may be statistically significant, but the differences may not be meaningful or large enough to have practical relevance or implications.

As described in the Methodology section,5 this report contains a specific set of descriptive terms, with corresponding quantitative ranges, that summarize quantitative results from both the ARS and specific findings that were quantified from the site visit reports.

Besides the quantitative data, this report contains qualitative data from a number of open-ended questions that CLER Site Visitors asked during group interviews and walking rounds. This information, by design, was not intended to be enumerated. For these questions, the site visit teams made an assessment of the relative magnitude of observations at each individual site. To prevent confusion, these results are presented in the report using a set of descriptive terms different from those used for quantitative data; the qualitative descriptive terms are intended to approximate the quantitative terms.

Finally, this section follows approximately the same structure as the individual CLER Site Visit reports received by participating institutions. This structure is intended to facilitate easy comparison between data from an individual site and that of this report, which aggregates results from all 287 SIs. Those who seek additional detail may consult the Appendices (p. 81–124).
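To illustrate the difference between the two levels of analysis, a minimal sketch follows. The response data are invented for illustration and are not figures from the report; the point is only that individual-level results pool all respondents, while CLE-level results first compute a percentage per CLE and then summarize that distribution as median and IQR.

```python
import statistics

# Hypothetical yes/no (1/0) ARS responses, grouped by clinical
# learning environment (CLE); these numbers are illustrative only.
responses_by_cle = {
    "CLE-A": [1, 1, 0, 1],     # 3 of 4 answered "yes"
    "CLE-B": [0, 1, 0, 0, 1],  # 2 of 5 answered "yes"
    "CLE-C": [1, 1, 1, 0],     # 3 of 4 answered "yes"
}

# Individual level: percentage of all individuals surveyed, pooled.
all_responses = [r for rs in responses_by_cle.values() for r in rs]
individual_pct = 100 * sum(all_responses) / len(all_responses)

# CLE level: compute a percentage per CLE, then summarize the
# distribution of those per-CLE percentages as median and IQR.
cle_pcts = sorted(100 * sum(rs) / len(rs) for rs in responses_by_cle.values())
median_pct = statistics.median(cle_pcts)
q1, _, q3 = statistics.quantiles(cle_pcts, n=4)  # quartile cut points

print(f"Individual level: {individual_pct:.1f}%")
print(f"CLE level: median {median_pct:.1f}% (IQR {q1:.1f}%-{q3:.1f}%)")
```

Note that the two summaries can differ: large CLEs dominate the individual-level percentage, whereas each CLE counts equally at the CLE level.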
Appendix A contains additional information on the SIs, sites visited, and groups interviewed; Appendix B contains selected aggregated quantitative results from the group interviews with residents and fellows; and Appendix C contains qualitative information from the group interviews and walking rounds.

The CLER Program explored several aspects of resident and fellow engagement in patient safety, with emphasis on 5 major topics: culture of safety, use of the patient safety event reporting system, knowledge of patient safety principles and methods, inclusion in patient safety event investigations, and disclosure of patient safety events. Generally across CLEs, members of the executive leadership team identified patient safety as their highest priority area for improvement.

The patient safety and quality leaders in many CLEs indicated that they periodically conduct a culture of safety survey that includes residents, fellows, and faculty members. Overall, 97.7% of the residents and fellows in the group interviews reported that their CLE provides a safe and nonpunitive environment for reporting errors.

Across CLEs, physicians and other staff members also reported use of the patient safety event reporting system to report on individual behaviors.
This use included reporting on behaviors in a retaliatory fashion or in a manner that could be perceived as punitive. Given this and based on the collective findings from the site visits, it is unclear whether residents, fellows, and other staff members perceived a safe and nonpunitive culture for reporting patient safety events.

Overall, CLEs had 1 or more mechanisms for reporting patient safety events, including an online or paper-based patient safety event reporting system, a chain-of-command system that allowed events to be reported to an immediate supervisor (eg, a more senior resident or faculty member), and a mechanism to verbally report events to the patient safety staff (eg, a hotline).

In general, residents and fellows appeared to be aware of their CLE's process for reporting patient safety events such as adverse events, near misses/close calls, and unsafe conditions. During walking rounds, the CLER Site Visit teams also asked nurses about their CLE's patient safety event reporting system. Across nearly all CLEs (97.2%), nurses appeared to be familiar with their CLE's system for reporting patient safety events.

Approximately 78% of CLEs were able to provide information on the number of patient safety event reports submitted by residents and fellows (see Appendix C1), and 70.7% were able to provide the number of patient safety event reports submitted by attending physicians. The remaining CLEs indicated that their systems did not track such information.
Whereas CLEs occasionally provided the Graduate Medical Education Committee and their governing body with information on the number or percentage of patient safety event reports submitted by residents and fellows, it was less common for them to routinely report to these same groups the number or percentage of patient safety event reports submitted by faculty members.

Generally across CLEs, the residents and fellows interviewed on walking rounds appeared to lack understanding and awareness of the range of reportable patient safety events, including what defines a near miss/close call. In most CLEs (83.6%), nurses' understanding of reportable patient safety events also varied (see Appendix C2).

Across CLEs, residents, fellows, and nurses appeared to focus on reporting sentinel events, medication errors, patient falls, and other events with harm; they did not appear to recognize near misses/close calls, unsafe conditions, events without harm, unexpected deteriorations, or known procedural complications as reportable patient safety events. Residents, fellows, and nurses appeared to have little awareness of the importance of reporting these events and how such reporting can provide valuable information for identifying system failures, addressing vulnerabilities in the system, reducing risks, and improving patient safety.

Overall, 72.7% of the residents and fellows in the group interviews indicated that they had experienced an adverse event, near miss/close call, or unsafe condition while at their CLE. This experience varied by gender, year of training, and specialty grouping (see Appendix B1).

Of the residents and fellows who reported that they had experienced an adverse event, near miss/close call, or unsafe condition, 49.8% indicated that they had personally reported the patient safety event using the CLE's patient safety event reporting system. Responses varied by gender, year of training, and specialty grouping.
Across CLEs, the median (IQR) finding was 50.0% (37.5%–66.7%) and varied by region, CLE bed size, and type of ownership (see Appendix B2). Of those who did not personally enter the patient safety event into the system, 13.6% indicated that they relied on a nurse to submit the patient safety event report, 24.4% indicated that they relied on a physician supervisor, and 12.1% indicated that they cared for the patient and chose not to submit a report.

When faculty members and program directors in the group interviews were asked what process residents and fellows most frequently followed when reporting a patient safety event, 57.9% of the faculty members and 53.7% of the program directors indicated that they believed residents and fellows most often reported the event themselves using the CLE's patient safety event reporting system.

In a separate query, 23.6% of the residents and fellows in the group interviews indicated that they had reported a near miss/close call event while at the CLE; responses varied by gender, year of training, and specialty grouping (FIGURE 1). Across CLEs, this finding ranged from 0% to 100%, with a median (IQR) of 23.1% (15.2%–33.3%); responses varied by region and type of ownership (see Appendix B3).

On walking rounds, residents and fellows in many CLEs mentioned that they often report patient safety events locally or through their chain of command, while also indicating familiarity with the patient safety event reporting system and its use. When they delegated reporting or relied on others to report, it was unclear whether these reports were formally captured in the CLE's centralized patient safety event reporting system. Residents and fellows mentioned the cumbersome process of submitting a report, the time needed to enter a report, fears of repercussion, and uncertainty about receiving feedback as reasons for not reporting.
The collective information from the site visits indicated that in 70.6% of the CLEs, resident and fellow reporting of patient safety events into the CLE's patient safety event reporting system was varied or infrequent (see Appendix C3).

In the group interviews, the CLER teams also explored faculty members' and program directors' use of the CLE's patient safety event reporting system. Approximately 36% of the faculty members reported that they had personally reported an adverse event, near miss/close call, or unsafe condition in the past year (median [IQR], 35.7% [26.0%–46.6%] across CLEs). Among the program directors, 35.9% reported that they had personally reported an adverse event, near miss/close call, or unsafe condition in the past year (5.5% had no clinical responsibilities at the site). Across CLEs, the median (IQR) finding was 36.0% (27.3%–50.0%). In both groups, responses varied by CLE bed size and type of ownership.

In the group interviews, the CLER teams asked residents and fellows whether they received feedback on patient safety event reports. Of those who had experienced an adverse event, near miss/close call, or unsafe condition and who had personally submitted a patient safety event report or relied on a nurse or supervisor to submit the report, 46.1% reported that they received feedback on the outcome of the report. Responses varied by gender, specialty grouping, and year of training (FIGURE 2; see also Appendix B4).

Residents and fellows often mentioned receiving an e-mail acknowledging receipt of the patient safety event report. They also noted receiving requests for additional information as part of a formal patient safety event investigation. It was uncommon for residents to mention receiving information on the outcome of the investigation, including recommended actions to address vulnerabilities in the system and to improve patient safety.
Across CLEs, residents, fellows, nurses, and other clinical staff expressed a strong desire to receive feedback in response to submitting a patient safety event report.

Overall, CLEs varied in their processes for reviewing and prioritizing patient safety events. Residents and fellows also varied in their knowledge of these processes and often used the term "black box," indicating that these processes were unclear. Many residents and fellows appeared to be unaware of how their CLEs use the reporting of adverse events, near misses/close calls, or unsafe conditions to improve care both broadly and at the individual departmental level. Residents and fellows were rarely involved in their CLE's process for reviewing and prioritizing patient safety events that required further investigation.

On walking rounds, the CLER teams explored resident and fellow participation in the time-out process as part of patient safety practices (eg, ambulatory and bedside procedures). Across many CLEs, residents, fellows, nurses, and other health care professionals interviewed on walking rounds indicated that residents and fellows do not consistently conduct standardized time-outs before performing bedside procedures.

Across most CLEs (91.6%), residents and fellows appeared to have limited knowledge of fundamental patient safety principles and methods (eg, the Swiss cheese model of system failure, root cause analysis, fishbone diagrams; see Appendix C4).

When asked to identify their skills in applying patient safety principles, the majority of the faculty members indicated that they were either proficient or expert (62.7% and 25.1%, respectively) in applying these skills. Similarly, most of the program directors reported themselves as proficient or expert (63.6% and 21.9%, respectively).

Of the residents and fellows in the group interviews, 36.3% reported that they had participated in a structured interprofessional simulation activity related to patient safety.
Responses varied by gender, year of training, and specialty grouping. Across CLEs, the median (IQR) finding was 37.1% (26.3%–50.0%), with responses varying by region and type of ownership.

In many CLEs, the patient safety and quality leaders indicated that they did not track resident and fellow participation in patient safety event investigations (eg, root cause analysis). A limited number of CLEs provided the Graduate Medical Education Committee and the governing body with information regarding the number of residents and fellows who had participated in formal patient safety event investigations.

The CLER teams also asked the program directors in the group interviews if they measured resident and fellow participation in patient safety event investigations. Approximately 42% of the program directors reported tracking resident and fellow involvement (median [IQR], 44.4% [30.0%–66.7%] across CLEs). Responses varied by region, CLE bed size, and type of ownership.

In the group interviews, 37.6% of the residents and fellows who were postgraduate year 3 (PGY-3) and higher indicated that they had participated in an interprofessional investigation of a patient safety event that included components such as analysis of system issues, development and implementation of an action plan, and monitoring for continuous improvement. Responses varied by specialty grouping (FIGURE 3). Across CLEs, the median (IQR) finding was 37.6% (28.6%–50.0%), with responses varying by region, CLE bed size, and type of ownership (see Appendix B5).

The CLER teams also asked faculty members about their involvement in interprofessional patient safety event investigations.
Approximately 64% of the faculty members in the group interviews reported that they had participated in an investigation of a patient safety event that involved physicians, nurses, administrators, and other health care professionals (median [IQR], 63.3% [53.0%–73.2%] across CLEs).

Overall, the format and process of investigating patient safety events varied both across and within CLEs. It was uncommon for residents and fellows to describe involvement in comprehensive systems-based approaches to patient safety event investigations aimed at preventing future adverse events and sustaining improvements in patient safety. In general, residents and fellows described experiences that lacked the attributes of a formal patient safety event investigation, with very little or no interprofessional or interdisciplinary engagement. Residents and fellows varied widely in their perceptions of what constituted a formal investigation of a patient safety event. Across many CLEs, case conferences, morbidity and mortality conferences, and grand rounds continued to be the major approach to patient safety event investigations. Faculty members and program directors indicated that departmental mortality conferences, case conferences, and online modules were other informal approaches to model elements of a patient safety event investigation.

In the group interviews, 66.0% of the residents and fellows indicated that they had received training on disclosing medical errors to patients and/or families (4.5% reported that such training was not applicable). Responses varied by year of training. Across CLEs, the median (IQR) finding was 68.2% (57.1%–79.3%), with responses varying by region and CLE bed size.
Of those who received training, 10.1% indicated that the training was primarily simulation based; 69.8%, didactic and/or online; 15.1%, informal; and 5.0%, other.

Approximately 82% of the residents and fellows in the group interviews indicated that they knew of CLE resources to assist them in coping with a major patient safety event that resulted in a patient death (median [IQR], 85.8% [74.7%–93.0%] across CLEs; see Appendix B6 for information on variability). Of those familiar with the resources, most indicated that they would be somewhat comfortable (39.8%) or very comfortable (44.7%) in using these resources.

The CLER Program explored resident and fellow engagement in improving health care quality within the context of 6 major areas: involvement in developing and implementing the CLE's strategies for health care quality, awareness of the CLE's health care quality priorities, knowledge of health care quality terminology and methods, engagement in quality improvement (QI) projects, access to quality metrics data, and engagement in CLE efforts to address health care disparities.

As part of understanding the CLE's approach to improving health care quality, the CLER Site Visit teams reviewed the organization's strategic plan for quality and interviewed both executive and patient safety and quality leaders. Overall, a limited number of CLEs appeared to integrate QI within the organization as part of a system-wide, comprehensive approach to promote experiential learning and to improve quality and safety across the organization.

Across CLEs, resident and fellow involvement in strategic planning for QI was uncommon. Residents and fellows often served as implementers of CLE-wide QI activities (eg, hand hygiene, reducing hospital-acquired infections, reducing 30-day readmissions). A limited number of CLEs had instituted resident and fellow committees aimed at increasing resident and fellow engagement in QI; few of these committees were integrated into the CLE's formal QI processes.
In many CLEs, resident and fellow participation in institutional QI committees was uncommon; often, roles and expectations for participation were undefined or unclear. The clinical sites also appeared to have insufficient structure to allow residents and fellows to attend committee meetings regularly and to participate in meaningful ways. Additionally, residents and fellows in many CLEs were not included in the governing body's patient safety and quality committees.

In general, priorities for improving health care quality varied across CLEs. However, some common themes included alignment with broad national priorities such as Centers for Medicare & Medicaid Services value-based purchasing, Core Measures, or publicly reported performance measures. Many were also highly focused on meeting specific criteria such as reducing 30-day readmissions or improving performance on metrics related to pneumonia, chronic heart failure, and surgical care improvement project measures.

In the group interviews, 78.8% of the residents and fellows (PGY-2 and above) reported knowing their CLE's priorities for improving health care quality (see Appendix B7 for additional information on variability). When asked the same question, 84.4% of the faculty members and 86.7% of the program directors reported knowing the priorities. Often, the physician groups focused on departmental activities and did not describe priorities that aligned with those identified by the CLE's executive leadership or the patient safety and quality leaders.
When the physicians identified priorities aligned with those of executive leadership, they most commonly cited nationally recognized measures, especially those related to programs with financial incentives such as measures from the Centers for Medicare & Medicaid Services.

In 55.1% of the CLEs, the residents and fellows appeared to have limited knowledge or understanding of basic QI terminology and methods such as Lean, Plan-Do-Study-Act, and Six Sigma (FIGURE 4; see also Appendix C5). A limited number of residents and fellows could articulate the QI approach employed by their CLE in designing and implementing QI activities to improve patient care.

In general, the approach to educating residents and fellows about health care QI varied both within and between CLEs. Although some type of education was common as part of new resident and fellow orientation, a limited number of CLEs aimed to provide ongoing training for all residents and fellows in health care QI.

In the group interviews, residents and fellows (PGY-2 and above) reported on whether they had participated in a QI project of their own design or one assigned by their program or CLE, and whether their projects were linked to 1 or more of the CLE's quality priorities or involved interprofessional teams; the Appendices provide detailed information from these group interviews. On walking rounds, the CLER teams asked residents and fellows to describe their QI projects. Overall, residents and fellows varied in their descriptions of these projects. It was uncommon for residents and fellows to describe projects that aligned with their CLE's priorities.
In most CLEs, few residents and fellows described projects that included all the components of a formal QI effort. Often, resident and fellow participation was limited to planning and implementing a QI project; for many residents and fellows, their QI experiences did not formally include analyzing data, designing actions to improve care, and ongoing monitoring. It was also uncommon for residents and fellows to describe involvement in interprofessional QI projects. During the interviews on walking rounds, a limited number of nurses and other health care professionals indicated that they were involved in interprofessional QI projects that included residents and fellows.

When the CLER teams asked faculty members in the group interviews about their engagement in interprofessional QI projects, 72.7% reported that they had participated in a QI project with nurses, pharmacists, and other members of the health care team.

In the group interviews, many of the program directors reported that their residents and fellows have access to QI data; clinical registries and internal or national quality measures were often reported as common sources of QI data. Many faculty members noted that residents and fellows had limited access to such data; when data were available, access was often at a departmental level. The type and amount of QI data available to residents and fellows varied both within and across CLEs. Residents and fellows in the group interviews also reported on whether they received aggregated QI data on their patients; responses varied by gender, year of training, and specialty grouping.
Across CLEs, responses varied by region, CLE bed size, and type of ownership. In some CLEs, the patient safety and quality leaders indicated that residents and fellows receive QI data comparing the care of their patients with that of others served by their clinical site.

In many CLEs, executive leaders were aware of disparities in health care affecting the populations their organizations serve. Many described efforts to improve access to care, often led by residents and fellows from a few core specialties; a limited number of residents and fellows from other specialty programs reported participating in these efforts. Overall, few executive leaders described a specific set of strategies or a systematic approach to identifying and addressing disparities in the care provided to their patient populations at risk for health care disparities. In some CLEs, the executive leaders, faculty members, or program directors indicated that some departments were collecting data related to health care disparities for specific patient populations.

In the group interviews, 55.1% of the residents and fellows reported that they knew their CLE's priorities in addressing disparities in health care; responses varied by year of training and specialty grouping.
Across CLEs, responses varied by region, CLE bed size, and type of ownership (see Appendix B). Faculty members and program directors also reported on whether they knew their CLE's priorities with respect to health care disparities, and residents, fellows, faculty members, and program directors interviewed in the group interviews described populations at risk for health care disparities at their clinical sites.

In the group interviews, residents and fellows reported on whether they had received training specific to the populations at risk for health care disparities at their clinical site; some reported receiving training that was not specific to the CLE's patient population, some reported receiving training that was primarily informal while delivering clinical care, and some indicated that they had not received such training. Across CLEs, responses varied by region, CLE bed size, and type of ownership. During interviews on walking rounds, many residents and fellows described education and training in health care disparities that was general and not specific to the populations served by their clinical site.
- News Article
- 10.4300/jgme-d-21-01177.1
- Feb 1, 2022
- Journal of Graduate Medical Education
Pursuing Excellence: Innovations in Designing an Interprofessional Clinical Learning Environment.
- News Article
- 10.4300/jgme-d-16-00315.1
- Jul 1, 2016
- Journal of Graduate Medical Education
Early Impressions of the CLER Program: A Survey of the Designated Institutional Official Community.
- Research Article
- 10.4300/1949-8349.10.4s.77
- Aug 1, 2018
- Journal of Graduate Medical Education
Lessons Learned and Future Directions: CLER National Report of Findings 2018.
- News Article
- 10.4300/jgme-d-21-00793.1
- Oct 1, 2021
- Journal of Graduate Medical Education
CLER Pursuing Excellence: Faculty Development Innovations in Quality, Safety, Equity, and Value.
- News Article
- 10.4300/jgme-d-22-00490.1
- Aug 1, 2022
- Journal of Graduate Medical Education
Program Directors Patient Safety and Quality Educators Network: A Learning Collaborative to Improve Resident and Fellow Physician Engagement.
- Research Article
- 10.4300/1949-8349.10.4s.13
- Aug 1, 2018
- Journal of Graduate Medical Education
The Accreditation Council for Graduate Medical Education (ACGME) established the Clinical Learning Environment Review (CLER) Program1,2 to provide graduate medical education (GME) leaders and executive leaders of hospitals, medical centers, and other clinical settings with formative feedback in the 6 CLER Focus Areas.3 This feedback is aimed at improving patient care while optimizing the clinical learning environment (CLE). This report details findings from the second set of CLER Site Visits, which the CLER Program conducted from March 31, 2015, to June 10, 2017.

The aggregated findings in this report reflect a mixed methods approach (ie, both quantitative and qualitative information gathering and analysis), which the CLER Program used to form a comprehensive base of evidence on how the nation's CLEs engage residents and fellows in the CLER Focus Areas.

In addition to findings from the second set of CLER visits, this report includes an initial look at changes on a selected set of measures in each of the CLER Focus Areas since the first set of CLER visits (2012–2015). This 2-point analysis highlights both progress and challenges in CLEs. These findings can enhance and extend understanding of the complex and dynamic nature of CLEs and help inform conversations on how to continually improve physician training to ensure high-quality patient care within these learning environments.

In 2015, there were 725 ACGME-accredited Sponsoring Institutions (SIs) and nearly 1800 major participating sites, which are the hospitals, medical centers, ambulatory units, and other clinical settings where residents and fellows train. This report contains findings from the 287 CLEs that had 3 or more ACGME-accredited core residency programs.
These CLEs were affiliated with 287 SIs that collectively oversaw 9167 residency and fellowship programs (89.1% of all ACGME programs) and 111 455 residents and fellows (87.1% of all residents and fellows in ACGME-accredited programs).a Appendix A provides additional information on the general characteristics of these SIs (eg, type of SI, number of programs) compared to all ACGME-accredited SIs.

For SIs with 2 or more clinical sites that served as participating sites, the CLER Program visited 1 site due to resource limitations. This selection was based on 2 factors: (1) which CLE served the largest possible number of programs for that SI, and (2) whether that CLE had the availability of both the designated institutional official (DIO) and the chief executive officer (CEO) for the opening and exit interviews.

The CLER Site Visit protocol included a structured schedule of events for each visit (FIGURE 1). CLER Program staff notified clinical sites of their CLER Site Visit at least 10 days in advance. This relatively short notice was intended to maximize the likelihood of gathering real-time information from interviewees.

The number of site visitors and the visit length varied according to the number of programs and residents and fellows at the site, with teams comprising 2 to 4 CLER Site Visitors and visits lasting 2 to 3 days. A full- or part-time salaried employee of the ACGME led each CLER Site Visit team.
Additional team members included other CLER Site Visitors, ACGME staff, or trained volunteers from the GME community. For each site visit, the CLER Site Visitors conducted group interviews in the same order: (1) an interview with the DIOb; (2) an initial group interview with the CEO, members of the executive team (eg, chief medical officer, chief nursing officer), the DIO, and a resident representative; (3) a short interview with patient safety and quality leaders; (4) a group interview with residents and fellows; (5) a group interview with faculty members; (6) a group interview with program directors; (7) a second interview with patient safety and quality leaders; and (8) an exit meeting with the CEO, members of the executive team, the DIO, and a resident representative. Following specific guidelines, each clinical site provided the site visitors with a list of all individuals attending the group interviews before the site visit. The CLER team conducted all interviews in a quiet location without interruption and ensured that the interviews did not exceed 90 minutes. The purpose of the initial meetings with executive and patient safety and quality leaders was to allow the CLER team to become familiar with the basic language and culture of the CLE's current activities in the 6 CLER Focus Areas. This information helped inform subsequent interviews and observations during the CLER visit. The resident and fellow group interviews comprised 6 to 32 peer-selected participants per session. Specifically, residents and fellows at the SI, excluding chief residents, voted for their peers to attend the group interviews. The participants broadly represented ACGME-accredited programs at the clinical site with proportionally more individuals from larger programs. The CLER team primarily interviewed residents and fellows who were in their postgraduate year 2 (PGY-2) or higher to ensure that interviewees had sufficient clinical experience to assess the learning environment. 
PGY-1 residents in a transitional year residency program were permitted to attend. For the group interviews with faculty members and program directors, the CLER Program instructed the DIO to invite participants to attend the group interviews. In the faculty member group interviews, each session comprised 5 to 32 clinical faculty members who broadly represented the residency and fellowship programs at the CLE. Program directors were not permitted to attend the faculty member meetings. Group interviews with program directors comprised 3 to 32 leaders of ACGME-accredited core residency programs at each clinical site; sessions included associate program directors when program directors were not available. For CLEs with more than 30 programs, 2 separate sets of interviews were conducted with residents and fellows, faculty members, and program directors, with no more than 32 participants attending an individual session. Additionally, the CLER Site Visit team conducted a set of walking rounds, escorted by senior or chief residents and fellows, to observe various patient floors, units, and service areas. The CLER Program asked the DIO to select residents and fellows, preferably from a range of different specialties, to guide each CLER Site Visitor. Residents and fellows who participated in the resident and fellow group meetings or who served as the resident representative in the executive leadership meeting were not permitted to serve as escorts for the walking rounds. The walking rounds enabled the CLER Site Visit team to gather feedback from physicians, nurses, and other health care professionals (eg, pharmacists, radiology technicians, social workers) in the clinical setting. Each CLER Site Visitor conducted at least 3 sets of walking rounds per clinical site, with each walking round lasting 90 minutes. 
For larger CLEs, site visitors conducted an additional fourth walking round lasting 60 minutes. Throughout each visit, the CLER team conducted huddles to discuss the information they had gathered. Later during the visit, they held a team meeting to synthesize their findings, reach consensus, and prepare both an oral report and a draft of a written narrative report. At the exit meeting, the CLER team shared with executive leadership its oral report, which covered initial feedback on the 6 CLER Focus Areas. The written report, delivered approximately 6 to 8 weeks after the visit, reflected the same topics but with a more comprehensive and detailed set of observations. The intention of both the oral and written reports was to provide formative information that would help executive leadership assess their practices in the 6 CLER Focus Areas, inform resident and fellow training, and guide improvements in the CLE to ensure high-quality patient care. Survey instruments. To conduct the group interviews, the CLER Site Visitors used a structured questionnaire developed under the guidance of experts in GME and/or the 6 CLER Focus Areas. The questionnaires contained both closed- and open-ended questions. After the questionnaires were initially content validated by expert review, the CLER Program field tested the instruments on 4 CLER Site Visits. At the conclusion of each of these visits, the items were refined as part of an iterative design process; with each iteration, the CLER Program reviewed and revised the items as necessary based on feedback from interviewees and interviewers. Walking rounds. 
The CLER Program designed the walking rounds to facilitate random, impromptu interviews with residents, fellows, nurses, and other health care professionals across a number of clinical areas (eg, inpatient and outpatient areas, emergency departments) where residents and fellows were trained, based on the SI's ACGME-accredited specialty and subspecialty programs. The aims of the walking rounds were to (1) triangulate, confirm, and cross-check findings from the group interviews and (2) glean new information on residents' and fellows' experiences across the 6 CLER Focus Areas. The walking rounds provided important information that could either confirm or conflict with the information gathered in group interviews. CLER Site Visit reports. The CLER Site Visitors synthesized findings from each visit in a written report, working from a formal template developed and refined in the early stages of the CLER Program. The template assisted the CLER Site Visit team in ensuring that each of the 6 CLER Focus Areas was fully addressed in the oral and written reports for each clinical site. The reports also included a brief description of the clinical site and any of its notable aspects. All members of the CLER Site Visit team reviewed and edited each report for accuracy and to achieve consensus on the findings. Other sources of data. Several other sources of data were used to augment the site visit data, including the ACGME annual data reportsc and the 2015 American Hospital Association (AHA) Annual Survey Database.d The ACGME reports provided information on the SIs, programs, and physicians in GME, including the number of ACGME-accredited programs, number of residents and fellows matriculated, and university affiliation. The AHA data offered CLE information, including type of ownership (eg, nongovernment, not-for-profit versus investor-owned, for-profit) and size, as measured by the number of staffed acute care beds. Group interviews with an audience response system (ARS). 
CLER Site Visitors conducted group interviews with residents and fellows, faculty members, and program directors using a computerized ARS (Keypoint Interactive version 2.6.6, Innovision Inc, Commerce Township, MI) that allowed for anonymous answers to closed-ended questions. The ARS data were exported into a Microsoft Excel spreadsheet and then into a software package for statistical analysis. Site visitors documented responses to open-ended questions qualitatively. The 3 surveys—1 each for residents and fellows, faculty members, and program directors—consisted of 45, 35, and 36 closed-ended questions and 25, 25, and 27 open-ended questions, respectively. Group interviews with no ARS. CLER Site Visitors documented all responses qualitatively for group interviews with the DIO (39 questions); with the CEO, members of the executive team, the DIO, and the resident representative (38 questions); and with patient safety and quality leadership (70 questions). Descriptive statistics. Descriptive statistics were used to summarize and describe the distribution and general characteristics of SIs, CLEs, and physician groups interviewed. For SIs, characteristics included SI type (eg, teaching hospital, medical school) and the number of ACGME-accredited residency and fellowship programs per institution. CLE characteristics included type of ownership (eg, nongovernment, not-for-profit), number of licensed beds, and total staff count. Demographic information included gender and medical specialty of physicians who participated in the group interviews. Analysis of ARS data. Analyses were conducted at both the individual (eg, resident and fellow) and the CLE level. For the individual-level analyses, results are based on the total sample of individuals surveyed, presented as percentages. For CLE-level analyses, results show differences between CLEs after individual responses were aggregated at the CLE level and are presented as medians and interquartile ranges. 
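As an illustration of the two levels of analysis described above, the following Python sketch contrasts an individual-level percentage with a CLE-level median and interquartile range. The response data are invented for illustration only (the CLER Program itself analyzed its data in SPSS):

```python
from statistics import median, quantiles

# Illustrative ARS-style data (NOT actual CLER results): each CLE maps to
# the anonymous yes(1)/no(0) answers its residents and fellows gave to
# one closed-ended question.
responses = {
    "CLE A": [1, 0, 1],
    "CLE B": [1, 1],
    "CLE C": [0, 1, 0, 1],
}

# Individual-level analysis: one pooled percentage over everyone surveyed.
pooled = [a for answers in responses.values() for a in answers]
individual_pct = 100 * sum(pooled) / len(pooled)

# CLE-level analysis: compute each CLE's own percentage first, then
# summarize the spread across CLEs as a median and interquartile range.
per_cle = sorted(100 * sum(a) / len(a) for a in responses.values())
median_pct = median(per_cle)
q1, _, q3 = quantiles(per_cle, n=4)  # interquartile range = (q1, q3)
```

The two summaries can diverge: large CLEs dominate the pooled individual-level percentage, whereas each CLE counts equally in the CLE-level median.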
These 2 levels of analysis provided a national overview of the state of CLE engagement in the 6 CLER Focus Areas and revealed how CLEs compared on these outcomes. Chi-square analysis was used to compare resident and fellow responses and to identify any relationships in responses by (1) gender; (2) residency year; and (3) specialty grouping. Chi-square analysis was also used to explore whether differences were associated with the following CLE characteristics: (1) regional location; (2) bed size; and (3) type of ownership. Categories in the annual AHA survey informed grouping of CLE-specific variables (eg, bed size). P values of .05 or less were considered statistically significant. All statistical analyses were conducted using SPSS Statistics version 22.0 (IBM Corp, Armonk, NY). Analysis of CLER Site Visit reports. Specific findings based on responses to non-ARS questions and interviews on walking rounds were systematically coded in NVivo qualitative data analysis software version 11 (QSR International Pty Ltd, Doncaster, Victoria, Australia) following the principles of content analysis. Three members of the CLER Program staff, trained in qualitative data analysis, generated a master codebook through an iterative process by (1) independently applying codes to the data; (2) peer-reviewing coding; (3) discussing coding discrepancies; and (4) reaching agreement on the codes through consensus. The results were recorded as frequency counts for further descriptive analysis. Overall percentages and percentages stratified by CLE region, bed size, and type of ownership are reported. Two-point analysis of selected measures in the CLER Focus Areas. For this report, a selected set of measures in each of the CLER Focus Areas was examined to explore change over time for matched observations (ie, the same CLEs in both sets of visits). 
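The kind of chi-square comparison described above (eg, responses by gender or by CLE region) can be sketched for a 2×2 contingency table; the counts are invented and the Pearson statistic is computed by hand rather than in SPSS:

```python
# Hypothetical counts (NOT CLER data) of yes/no answers in two groups,
# eg, CLEs in two regions:
#                    answered yes   answered no
# Group 1:               40             10
# Group 2:               25             25
observed = [[40, 10], [25, 25]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E, where
# E = (row total * column total) / grand total.
chi_square = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi_square += (obs - expected) ** 2 / expected

# A 2x2 table has 1 degree of freedom; the critical value at P = .05
# is about 3.841, so a larger statistic is statistically significant.
significant = chi_square > 3.841
```

In practice a statistics package reports the exact P value; the critical-value comparison above is the equivalent decision rule at the report's .05 threshold.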
The final data set for this 2-point analysis comprised 242 CLEs; reasons for exclusion included health care system consolidations, changes in accreditation status (eg, voluntary withdrawal), changes in the number of core residency programs (eg, fewer than 3 core programs), and incomplete or missing data (see FIGURE 2). The measures examined for this section were the same in both sets of visits (ie, the questions remained constant between Cycle 1 and Cycle 2 of CLER visits). The Kolmogorov-Smirnov test was used to test for normality in the data. Based on the results of the Kolmogorov-Smirnov test and tests of symmetry, nonparametric tests were employed in the 2-point analysis. The Wilcoxon signed rank test (and the sign test when the data were nonsymmetrical) was conducted to compare changes in median percentage based on responses to closed-ended questions (ie, ARS data) that were aggregated at the CLE level. Based on coded extractions from the CLER Site Visit reports, the McNemar and marginal homogeneity tests were conducted to compare changes in the qualitative findings. P values of .05 or less were considered statistically significant. SPSS Statistics version 22.0 was used to conduct statistical analyses. Development of overarching themes and findings in the CLER Focus Areas. The overarching themes and findings by CLER Focus Areas were determined in 3 stages. First, through a key informant survey, the CLER Program staff asked each CLER Site Visitor to identify the overarching themes (ie, broad, high-level observations) and the challenges and opportunities in each of the CLER Focus Areas based on their summative experiences and observations. The CLER Program staff systematically analyzed the content of all responses to discern common themes and note salient concepts. 
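The matched-pairs logic of the 2-point analysis can be illustrated with the sign test, the simplest of the nonparametric tests named above: each matched CLE contributes a Cycle 1 and a Cycle 2 value, and the test asks whether increases outnumber decreases more than chance would allow. The CLE-level percentages below are invented, and this is a hand-rolled sketch, not the SPSS procedure the report used:

```python
from math import comb

# Invented CLE-level percentages on one measure for 10 matched CLEs.
cycle1 = [62, 55, 70, 48, 66, 59, 73, 51, 64, 58]
cycle2 = [70, 61, 69, 57, 74, 66, 80, 60, 71, 65]

# Sign test: keep only nonzero differences and count the increases.
diffs = [b - a for a, b in zip(cycle1, cycle2) if b != a]
n = len(diffs)
k = sum(d > 0 for d in diffs)  # number of CLEs that increased

# Two-sided exact binomial P value under H0: P(increase) = 0.5.
tail = min(k, n - k)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(tail + 1)) / 2 ** n)

significant = p_value <= 0.05  # the report's .05 threshold
```

The Wilcoxon signed rank test refines this by also ranking the magnitudes of the differences, which is why it was preferred when the differences were symmetric.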
The approach to analysis was inductive in that the themes emerged from the content of the responses. Next, the CLER Site Visitors reviewed and commented on the results and offered additional findings by consensus. Based on feedback from the site visitors, the CLER Program staff revised the summary of results and presented them to the CLER Evaluation Committee. Lastly, the members of the CLER Evaluation Committee reviewed the results and developed a set of commentaries on the importance of the findings and their impact on patient care and physician training. The work of the committee was achieved by consensus. Use of terms to summarize quantitative and qualitative results. For the purposes of this report, a specific set of descriptive terms is used to summarize quantitative results from both the ARS and the site visit reports: few (< 10%), some (10%–49%), most (50%–90%), and nearly all (> 90%). The summary of qualitative data (ie, responses to open-ended questions during group interviews and conversations on walking rounds) is based on the site visitors' assessment of the relative magnitude of responses. The following set of terms is intended to approximate the quantitative terms above: uncommon or limited, occasionally, many, and generally. Triangulation of the findings enhanced overall accuracy in the conclusions. The findings were cross validated for consistency and corroboration using multiple sources of complementary evidence and analytic techniques. For example, the ARS results were more meaningful when supplemented by critical qualitative information and vice versa. When a finding was supported in multiple places, the multiple sources of data provided greater insight and minimized the inadequacies of any individual data source. 
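The mapping from percentages to the report's descriptive terms can be expressed as a small function; the handling of values that fall between the stated bins (eg, 49.5%) is our reading of the bands, not something the report specifies:

```python
def describe(pct: float) -> str:
    """Map a percentage to the report's descriptive terms:
    few (< 10%), some (10%-49%), most (50%-90%), nearly all (> 90%).
    Boundary handling between bins is an assumption."""
    if pct < 10:
        return "few"
    if pct < 50:
        return "some"
    if pct <= 90:
        return "most"
    return "nearly all"
```

For example, a measure reported for 75% of CLEs would be summarized as "most," paralleling the qualitative term "many."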
This mixed methods approach provided a richer, more balanced, and more comprehensive perspective by allowing deeper dimensions of the data to emerge. As with any formative learning process, limitations to the CLER Program warrant consideration in using the information in this report. Perhaps most important, these findings do not imply cause and effect. Second, although this aggregated set of findings is designed to be highly representative, it is based on a series of sampled populations and thus may not be generalizable to all CLEs. As previously mentioned, the CLER teams interviewed a sample of residents, fellows, faculty members, program directors, and other clinical and administrative staff for each visit—with the aim of broad representation across all programs (eg, proportionally more individuals from larger programs). Although the goal was to achieve a broad degree of representativeness, the sample may or may not reflect the entire population. Given that the CLER Program is a formative assessment, this approach to sampling allowed for a broad and in-depth understanding of socially complex systems such as CLEs. The CLEs that were not included in this sample may represent different experiences and consequently could yield different conclusions as the CLER Program assesses them in future visits.
- News Article
2
- 10.4300/jgme-d-21-00946.1
- Dec 1, 2021
- Journal of Graduate Medical Education
CLER and COVID-19: A Formative Assessment of Clinical Learning Environments During the Pandemic.
- Research Article
2
- 10.4300/1949-8349.10.4s.11
- Aug 1, 2018
- Journal of Graduate Medical Education
Since its inception in 2012, the Clinical Learning Environment Review (CLER) Program has had at its core a commitment to formative assessment and feedback regarding graduate medical education (GME) engagement in 6 important, cross-cutting areas of focus—patient safety; health care quality; care transitions; supervision; fatigue management, mitigation, and duty hours; and professionalism. CLER's formative approach recognizes that, although there are shared elements, each hospital, medical center, and ambulatory care site that serves as a clinical learning environment (CLE) for resident and fellow physicians has a unique set of internal and external factors that influence the development and implementation of that CLE's strategic goals aimed at improving patient care. As in the CLER National Report of Findings 2016, the CLER Program continues to refer to CLEs as living and breathing entities—the embodiment of all of the individuals within these settings who influence and imprint upon these early learners. The CLER Program relies on the power of the information it provides to stimulate conversations and motivate CLEs to build upon their strengths and internally address opportunities for improvement. The CLER Program's primary link to accreditation is that every Sponsoring Institution (SI) must periodically complete a site visit. The 2016 National Report provided information on the CLER Program's background and structure.1 In the second set of visits to SIs with 3 or more core residency programs, the structure of the site visit remained essentially unchanged. 
The site visitors met with GME and executive leadership and the CLE's leaders in patient safety and health care quality; held group interview sessions with residents and fellows, faculty physicians, and program directors; and had numerous conversations with various members of the clinical care team while on walking rounds within the CLE. For the second set of visits, the CLER Site Visitors used Protocol 2.0, which was similar but not identical to the version used in the first set of visits (ie, Protocol 1.0). Whereas the majority of the questions remained constant, Protocol 2.0 included new questions to explore important topics in greater depth. In addition, it included several other changes to enhance the quality of the information gathered as part of the CLER Program's commitment to a model of continual improvement. In Protocol 2.0, the CLER Evaluation Committee continued to provide oversight of and guidance for all aspects of program development. The committee is composed of members with expertise in patient safety and health care quality improvement, as well as GME and executive leadership of hospitals and medical centers (eg, chief medical officer, chief nursing officer). The committee also includes postgraduate physician representation and public members. For Protocol 2.0, the committee reviewed and provided guidance on the changes described in the previous section. They also reviewed the data resulting from the site visits and brought an external voice in response to the findings—presented here as overarching themes and challenges and opportunities. Their views and commentary on the significance of the overarching themes and the challenges and opportunities are reflected in the discussion sections of this report. This report includes data from the second set of visits to 287 participating sites of SIs accredited by the Accreditation Council for Graduate Medical Education with 3 or more core residency programs. 
Similar to the first report of findings, this report presents several different perspectives, including overarching themes, highlights of the challenges and opportunities in each of the 6 CLER Focus Areas, and detailed findings, as well as a new section that examines changes since the last set of CLER visits. The CLER Program will present the findings from the initial visits to SIs with 2 or fewer core residency programs in a separate report scheduled for release in 2019.
- Research Article
10
- 10.4300/1949-8349.9.6s.7
- Dec 1, 2017
- Journal of Graduate Medical Education
Before exploring the projected future of Sponsoring Institutions (SIs), it is important to understand their origins and development. Graduate medical education (GME) has relied on diverse learning communities to educate residents for unsupervised practice. As 1 such learning community, the SI has contributed to the education of residents and fellows by ensuring the provision of support systems, resources, and administrative structures. Through their oversight of GME, SIs have fostered clinical learning and working environments in which residents and fellows achieve educational milestones that indicate their ability to provide high-quality, safe patient care upon completion of their programs and throughout their professional lives. What follows is a brief history of institutional accreditation by the Accreditation Council for Graduate Medical Education (ACGME), the process by which we have come to recognize the SI as that community necessary to form physicians. In 1982, the newly formed ACGME published its “Essentials of Accredited Residencies in Graduate Medical Education,” which included a set of “General Requirements” that established standards and responsibilities for GME programs. Part I of these General Requirements began with the declaration that “[p]rograms in graduate medical education are sponsored by institutions engaged in providing medical care and health services. 
The principal institutions for graduate medical education are hospitals.”2 In Section 1, “Responsibility of Institutions,” the ACGME outlined standards that remain at the core of the essential relationships and processes expected of every institution that seeks to educate residents: commitment to sponsorship agreed upon by both faculty and administration; distribution of resources for educational purposes; establishment of institutional policies by which program directors exercised various responsibilities for their respective programs; periodic analyses of each program's effectiveness in meeting its goals and objectives; provision of facilities and resources to support education; and maintenance of hospital accreditation.3 It is important to note that, at the time, these standards were not new to GME; they were based on well-established groundwork laid by the ACGME's predecessors in program accreditation, namely, the American College of Surgeons, the American Board of Internal Medicine, the American Medical Association's Council on Medical Education, and the Liaison Committee on Graduate Medical Education. Each of these organizations had previously acknowledged the hospital's role as GME sponsor and had included some variation of these expectations in their respective requirements. To fulfill its accreditation mission, the ACGME delegated authority to specialty-specific Review Committees (RCs) that evaluated individual residency programs for compliance with both the General Requirements and specialty-specific program requirements. Although the ACGME's General Requirements acknowledged SIs and the role they played in GME, attention to how these institutions fulfilled their responsibilities was initially provided in the context of individual residency program review conducted by RCs. 
While these specialty-specific RCs made accreditation decisions independently of each other, each Committee's activity was reviewed periodically by the ACGME through its Monitoring Committee. As a result, in these earliest years of ACGME's development as an accrediting body, there was no process by which the institutions were reviewed separately from specialty programs, nor was there consideration of how a single institution demonstrated compliance with its responsibilities as specified in the General Requirements for all the programs it sponsored. Gradually, the ACGME and its RCs began to observe that noncompliance with program requirements often had at least some correspondence to areas of institutional responsibility identified in the General Requirements. In 1992, as a result of this increasing awareness, the ACGME approved a separate section of the General Requirements identified specifically as Institutional Requirements, along with a process for institutional review that was initially administered through the ACGME's Monitoring Committee. These Institutional Requirements elaborated on the standards that had been outlined in the original General Requirements. This initial institutional review process included a separate institutional site visit. On the basis of the site visit report and information provided in an Institutional Review Document, an institutional review administrator determined whether the SI had adequately documented and implemented policies in compliance with the General Requirements. The administrator then prepared a list of the institutions judged to be in compliance with the General Requirements. The Monitoring Committee reviewed documentation only from a representative sample of these institutions. Only 2 decision options were available. 
Institutions that received a favorable decision were given 5 years until their next review; institutions receiving an unfavorable decision were assigned a time frame for their next review, not to exceed 3 years. Three consecutive unfavorable decisions for an institution resulted in a loss of its ability to sponsor residency and fellowship programs. During this period, the growth of medicine was characterized in part by new specialties and subspecialties approved for certification by various specialty boards and for accreditation by the ACGME. The result was an increased workload for the RCs, due to the number of applications by academic medical centers and community teaching hospitals for new specialty and subspecialty programs. While members of RCs were expert evaluators of programs in their specialty, lengthier review agendas, coupled with the added need to understand complex issues from an institutional perspective, underscored the growing need for a separate RC to review institutions. It also became clear to the ACGME that a cadre of individuals familiar with GME from an institutional perspective, and with knowledge of the General Requirements, should provide an expert review for institutional compliance. In response to these internal and external factors in the health care delivery environment, in GME itself, and within its own internal structure, the ACGME officially constituted the Institutional Review Committee (IRC) in 1995. The IRC was structured in the same manner as the specialty-specific RCs, except with regard to appointment of its members. In contrast to the RCs, the IRC had no formal specialty board or academic medical society of individuals with leadership responsibility for GME at the institutional level from which to draw its membership. 
As a result, the appointing organization for the IRC became the ACGME itself, with the original members drawn from GME leadership across the country. The only options available to the newly formed IRC with regard to the review of SIs remained the previously used statuses of favorable and unfavorable. The outcomes of these reviews were not, strictly speaking, accreditation decisions, although common usage in the field often referred to them as such. Loss of the ability to sponsor accredited programs remained as the action after 3 unfavorable decisions were given to an institution. Although such a decision was never made by the IRC, it nonetheless highlighted the growing importance of institutional review. While the seeds of accreditation were sown, the time for official institutional accreditation would wait for nearly a decade. As health care delivery and funding became more complex through the 1990s, the Institutional Requirements became more detailed. Experience with the institutional review led the ACGME and the IRC to identify with greater specificity the various elements and relationships necessary for maintaining an effective SI in this environment. Managing GME relationships across the institution was an effort that assumed progressively more responsibility and importance. For example, in 1993, the standard calling for each SI to have a Graduate Medical Education Committee (GMEC) was added to the Institutional Requirements; in 1997, the appointment at each SI of a Designated Institutional Official (DIO) as chief administrator for GME was approved. The DIO and GMEC were charged with oversight of the SI's GME efforts. The DIO became the “designated” leader with whom the ACGME could communicate information that affected all programs in the SI, starting with the basic responsibility of receiving the ACGME's annual invoice for accreditation services. 
Eventually, the DIO became the individual in the SI who was recognized as having the authority to act on behalf of the entire ACGME-accredited enterprise. The DIO had responsibility for accreditation matters, and also assumed many locally defined duties commonly associated with management and supervisory oversight in a complex organization. The Institutional Requirements began to codify the relationships needed at the institutional level for support of the educational environment, and the formal structures that comprised the GME local learning community took shape. In the mid- to late 1990s, the ACGME underwent its first intensive strategic planning process. A central component of the resulting plan, which eventually became known as the Outcome Project, was initiated in 1999 with a grant from the Robert Wood Johnson Foundation. The central focus of this effort was the identification of 6 general competencies as organizing principles for specialty and subspecialty curricula. ACGME-accredited programs were expected to identify educational outcomes for residents and to evaluate their achievement based on the general competencies that were eventually codified into all specialty and subspecialty program requirements. During this period, the educational leadership role of the SI's DIO and GMEC became even more apparent to the ACGME, the IRC, and SIs themselves. The need for central oversight of program curricula and resident evaluation processes emerged as a critical institutional responsibility for maintaining overall educational effectiveness within each program and across the institution. After 10 years' experience observing the maturation of the institutional review process and monitoring the work of the IRC, the ACGME Board of Directors delegated full accreditation authority to the IRC in 2005. The IRC acted with this authority to accredit all SIs having 2 or more accredited programs. 
With this change, the IRC maintained the same structure as all RCs, with the exception of its member selection process, which remained open to nominations of individuals from the entire GME community. In 2007, the IRC completed a major revision of the Institutional Requirements in an effort to remove extraneous language and to reflect the increasing recognition of the SI's central role in overseeing GME. As further indication of the growing importance of institutional review, this revision did not occur in a vacuum. DIOs and GMECs had begun to report that areas of programs' noncompliance cited by RCs occasionally overlapped with citations at the institutional level given by the IRC. One reason for such overlap (and sometimes even dissonance or contradiction) was that the Institutional and Common Program Requirements had developed independently. Therefore, an ad hoc committee that included representatives from the specialty RCs and the IRC was created to reconcile these 2 sets of requirements. The result was that a major revision of the ACGME Common Program Requirements also occurred in 2007 and included a thorough reconciliation with the Institutional Requirements. This reconciliation initiative revealed how institutional accreditation gradually had come of age within the ACGME accreditation structure and at the grassroots level. Specialty-specific RCs accepted the IRC's expertise in institutional matters and were no longer compelled to cite expectations outlined in the Institutional Requirements. Consistent with the recognition of the importance of institutional review, the IRC chair eventually became a full voting member of the ACGME Council of Review Committee Chairs, thus assuring an opportunity for ongoing dialogue between the specialty RCs and the IRC. 
The relationship of program and institutional accreditation was forged; it mirrored the expectation of collaboration among specialty and subspecialty programs at the level of the SI, facilitated by the DIO and GMEC.

Today, institutional accreditation is an integral component of the ACGME's ongoing strategic initiatives. With approval of its Next Accreditation System in 2011, the ACGME Board of Directors codified the responsibility of SIs for educational outcomes: “The ACGME accredits GME programs and SIs based on the demonstration of continuous oversight of processes and outcomes of education, and substantial compliance with accreditation standards, through the review of annually acquired information.”4 Effective institutional oversight, whether of a single program or of 100 programs, is the expectation under scrutiny by the IRC as it monitors institutions annually. Recognizing this central role of the SI in monitoring outcomes, the IRC completed an extensive revision of the Institutional Requirements in 2013. This time, the revision involved a major reordering and simplification of the language so that additional emphasis would be placed on demonstrable outcomes at the institutional level, even as the processes evident in the original standards remained.

Yet another development in institutional accreditation occurred in 2014 when the ACGME, the American Osteopathic Association (AOA), and the American Association of Colleges of Osteopathic Medicine (AACOM) announced their agreement to form a single GME accreditation system in the United States. This decision allows graduates of allopathic and osteopathic medical schools to complete their residency and/or fellowship education in ACGME-accredited programs and to demonstrate achievement of common milestones and competencies.
The move toward a single accreditation system means that many AOA-approved institutions and other entities that oversee AOA-approved programs have applied or will apply for ACGME accreditation as SIs. This landmark agreement has resulted in an expanded assortment of governing structures for SIs in which residents and fellows are educated. In addition to commonly encountered institutional models overseen by academic medical centers, community teaching hospitals, medical schools, and single-program institutions, newly ACGME-accredited SIs include osteopathic postdoctoral training institutions (OPTIs), which are educational entities that provide academic oversight and ensure GME resources for hospitals, colleges of osteopathic medicine, clinics, and teaching health sites, sometimes over large geographic areas.

In 2012, the ACGME Board of Directors approved the ACGME's Clinical Learning Environment Review (CLER) program, which “provides the profession and the public a broad view of SIs' initiatives to enhance the safety of the learning environment and to determine how residents are engaged in patient safety and quality improvement activities.”5 Although not an accreditation activity, the CLER program emphasizes the institutional relationships necessary to improve health care and population health.

Sponsoring Institutions now define themselves in various ways, often departing from the model of the singular hospital originally identified in the earliest standards for accreditation. What has remained consistent throughout the development of the institution's role in GME, however, is that it is the organizing force that affects GME at all levels, and that creates space for relationships that ensure the quality of resident education.

Sponsoring Institutions are currently in a state of accelerated evolution in response to major changes underway in the health care system and beyond.
The next stage in the maturation of SIs will necessarily benefit from reflecting on where these changes will lead. It will require the collaboration of the entire GME community to understand how SIs can meet the needs of patients by preparing physicians to practice in 2025. As in the past, it is through these relationships that SIs will realize the future vision for GME.
- Research Article
- 10.4300/jgme-d-21-00197.1
- Apr 2, 2021
- Journal of Graduate Medical Education
This article is the third in a 6-part series to chronicle the processes, work, and outcomes of the ACGME Pursuing Excellence Pathway Innovators Project. These articles provide an overview of the project, detail the 4 drivers developed to define the project, and present the evaluation process developed. This article highlights initiatives that integrate health system learners into quality, safety, equity, and value processes of health care organizations.

While the National Academy of Medicine's To Err Is Human1 and Crossing the Quality Chasm2 ushered in a new focus on quality and safety, and Unequal Treatment3 focused on health disparities, the ACGME Clinical Learning Environment Review (CLER) Program underscored the importance of these principles in the clinical learning environment (CLE).

The CLER Program, developed in 2012, increased the focus on Sponsoring Institutions' engagement of residents and fellows in quality improvement and patient safety within the CLE.4 The new focus was in part motivated by literature describing wide variations in care for standard practices, value-based care, and patient outcomes for similar conditions across geographic regions and health care systems. Layered on this is a growing body of literature demonstrating that residents and fellows who train in health systems with greater attention to high-quality care delivery carry these behaviors forward into practice, and therefore their patients may experience better health outcomes.5

The first CLER National Report of Findings, published in 2016, highlighted significant variation between institutions in the degree to which learners meaningfully engaged in activities related to quality, safety, equity, and value.6 For example, although residents were often aware of institutional quality priorities, they often defaulted to existing solutions and/or guidelines, rather than using knowledge and skills to develop and apply new approaches.
Residents' improvement work was often siloed by profession and department, with limited engagement in interprofessional teams. Specific improvement efforts identified by residents related to disparities focused on provision of services to low-income or marginalized populations, rather than linking quality to equity. Robust programs to engage residents and fellows in identifying and reducing local disparities were lacking. Lastly, a lack of experiential engagement in patient safety at all levels was observed, from frontline incident reports to high-level engagement with service- and institution-specific safety priorities.

The Pursuing Excellence in Clinical Learning Environments initiative (Pursuing Excellence) was created to address the variation across Sponsoring Institutions identified in the CLER National Report of Findings 2016 by testing innovations as part of a longitudinal social learning model. A core activity of Pursuing Excellence was the establishment of the Pathway Innovators Collaborative with the aim of enhancing the integration of graduate medical education and the health care delivery system to enable measurable improvement in patient outcomes and learner experience.7 Through a competitive application process, 8 diverse Sponsoring Institutions,8 varying in size and governance structures, were selected to participate in the Pathway Innovators Collaborative. Their charge was to work within their organizations and across Sponsoring Institutions to create sustainable and adaptable projects and processes that could improve the CLE and be shared with the entire GME learning community. This article provides a detailed perspective on the second of 4 major areas of work within the Pathway Innovators Collaborative.

Four primary drivers for improvement of the CLE for residents and fellows were identified to guide the Collaborative.
These primary drivers included alignment of the CLE and GME; engagement of residents and staff in quality, safety, value, and equity; faculty development; and interprofessional learning. All of these primary drivers included multiple secondary drivers (see Figure). Half of the teams began by focusing their initial efforts on Driver 2, “Establish the processes and practices that fully integrate clinical learning environment staff and learners into the pursuit of quality, safety, equity, and value in the organization.” This approach reflects the growing recognition of health systems science, which complements and synergizes with traditional basic and clinical sciences. All 8 sites met through sequential learning sessions, including in-person site visits, guided by a 4-year curriculum to refine their innovations.

The Collaborative teams noted several factors to consider for implementation of Driver 2. A key factor was that patient safety and quality improvement work often relied on established nursing unit–based structures. The transient nature of residents was not conducive to a shared learning experience. While residents commonly rotate through units, they are not a permanent part of the unit-based team, and may work on multiple rotations across units, in some cases across multiple hospitals. Also, many units held quality or safety huddles at set times to review institutional and local metrics, and residents were often not aware of those huddles or able to attend due to other clinical or educational obligations. The lack of integrated resident and unit-based quality and safety work could result in projects at risk of not being sustained after a resident rotated off-service or graduated.

An additional factor was lack of improvement expertise available at the point of care where residents work.
Many institutions have improvement experts charged with overseeing institutional improvement initiatives (ie, Operational Excellence, Lean Transformation); however, that expertise was often not known or available to residents. Likewise, residents and fellows often did not know what quality data were routinely collected or how to access these data to drive improvement. Recognizing these challenges, the Collaborative undertook the work of implementing and sustaining an infrastructure that aligned and integrated learners with staff through a specified set of action steps articulated below.

Each of the Collaborative teams utilized the secondary drivers of the Driver Diagram to begin the journey toward integrating residents and fellows in the quality, safety, value, and equity mission of the Sponsoring Institution. Notably, the Collaborative found it more practical and logical to approach the secondary drivers in the reverse order on the driver diagram. Articulated below is how the teams implemented each of the secondary drivers, presented in the sequence we propose for future use. To make progress, some overlap will likely be required. Specific examples of team approaches are in the Table.

Obtaining institutional resources and infrastructure to engage residents in quality, safety, equity, and value work is foundational to successful integration. This requires demonstrating a return on investment, allowing health system leaders to see this work as mission critical to health care delivery. Resources and infrastructure come not only in the form of leadership buy-in, but also in staff to support the work of engaging residents. The participants noted that this work can be both an educational and a clinical priority for the institution. One important way that alignment was achieved in the Collaborative was the explicit involvement of institutional leaders on each Pathway Innovators Steering Committee.
This helped institutional leaders gain better understanding of the work of graduate medical education and promote and shape institutional strategies. Residents and fellows can identify challenges and become important allies in identifying and solving systemic problems. In addition, each participating team created a business plan to gain buy-in from the C-suite by demonstrating the return on investment of meaningfully preparing faculty to effectively integrate residents, as well as other staff, into institutional improvement efforts. Interprofessional quality and safety initiatives were amplified and strengthened with broader participation. Bringing groups together helped standardize work and contributed to other efforts to improve patient outcomes, including length of stay, discharge planning, and interprofessional teamwork.

A key element of success is ensuring that residents are not only aware of, but also engage in, health system–level approaches to improving quality, safety, value, and equity. The participants noted that it is especially important to ensure maximal impact and alignment of resident work with that of the health system. While many of the Collaborative organizations use Lean process improvement to drive transformation, residents were often an afterthought in the design of institutional improvement activities. The Collaborative helped institutional leaders reassess their Lean process improvement activities to include residents and fellows so they are able to enact changes that can improve patient care, interprofessional learning, and participation. These changes can be embedded into the system and thereby sustained after residents graduate.
For example, several of the teams (Cleveland Clinic, Maine Medical Center, UCSF, University of Chicago, and University of Rochester) specifically integrated residents into existing systems-based improvement work using Lean transformation teams and Operational Excellence so that all residents learn and apply the institutional approach to quality improvement (Table).

Local expertise, including coaches, can be an important and practical approach to successfully training residents to engage in institutional quality, safety, equity, and value work. Institutions developed models that fit with their existing institutional frameworks. Examples include bringing together groups of learners and frontline providers with institutional improvement experts (Rochester) and training a cadre of interprofessional faculty coaches to support ongoing trainee-led improvement work (Cleveland Clinic, UCSF). Some Collaborative teams used in situ simulation to do this (Children's National), while others trained faculty to lead quality improvement discussions in the context of rounds (Our Lady of the Lake).

Health care quality pioneer Ernest A. Codman, MD, stated, “To effect improvement, the first step is to admit and record the lack of perfection.”9 Residents are accustomed to receiving subjective feedback about their clinical performance. They may be less likely to receive or understand their patient outcomes for a variety of reasons, including lack of data and lack of appreciation of their own impact.10 Two key elements to consider when providing outcomes data are the extent to which the data are timely and actionable.11 In many large programs, residents rotate between services, units, and even health systems on a monthly basis. If residents are to change their own behavior based on data, they must receive data while they are still on the rotation. It is not necessary that they receive their own individual data, as residents are members of teams affecting the care of a patient.
However, at a minimum, they should have benchmarks to understand what they should aim to achieve.

Residents should view the data as actionable (eg, residents' behavior changes may reasonably result in outcome changes). One can identify examples in which the link is linear (eg, timely removal of Foley catheters leads to fewer catheter-associated urinary tract infections [CAUTIs]) or less linear (patients filling their post-discharge medications may depend on factors that are both within and beyond the trainees' control). Teams noted that faculty (physicians or non-physicians) need to be equipped to help residents understand how to interpret and contextualize data to drive behavior change that leads to better patient outcomes. In the Collaborative, this often meant close alignment with the chief quality officer or quality staff to provide residents with access to existing quality dashboards or to design and deploy new ones.

As checklists, backup processes, and other types of improvement work have become “standard work,” health care systems have come to depend on non-physician clinicians and staff to ensure that best practices are followed. Examples include having operating room nurses take responsibility for time outs or clinical pharmacists review medication orders for safety. Clinical educators can take advantage of teaching opportunities to focus on topics such as improvement science, rounding on visibility boards, or interprofessional collaboration. Faculty can role model how these systems-based topics are integrated into daily practice, rather than being viewed as “add-ons.” For example, at Dell Medical School, faculty on one of their medicine teams, named the “Green Team,” test innovations in care while on service with their learners.

The Pathway Innovators experience demonstrates that the secondary drivers can be used to guide integration of residents and fellows into the quality, safety, equity, and value mission of the Sponsoring Institution. Our experience was that the order of the steps mattered.
In fact, for many participants, we found the ideal order was the reverse of how the secondary drivers were originally presented. It is important to note that, to make progress, secondary drivers may need to be addressed concurrently, as appropriate to the institution or activity.

While provision of resources and infrastructure is foundational, governance of most institutions was not designed to account for residents, making it harder to solicit support and create authentic roles for residents. Ironically, newer institutions may have an advantage because they can prospectively design systems to meaningfully integrate residents.

Going forward, all institutions should be challenged to rethink CLEs to ensure that residents and fellows receive experiential learning not only in clinical care but also in the topics of quality, safety, equity, and value foundational to health systems science. For example, existing clinical rotations can be redesigned to incorporate a focus on systems-based practice, as a value add for the health system and for patient care. While documenting the impact of residents on quality, safety, equity, and value is feasible, it can be labor-intensive; nonetheless, measuring outcomes and return on investment is critical to sustainability and long-term commitment of resources.

Perhaps most importantly, residents and fellows must also see the value in this integration. With increased workload and attention to well-being and burnout in training, faculty are obligated to demonstrate why quality, safety, equity, and value matter for patients, as well as for physician professional satisfaction and development. Leveraging external pressures can be a strategy for institutions to meaningfully engage residents and fellows.
- News Article
- 10.4300/jgme-d-22-00938.1
- Feb 1, 2023
- Journal of Graduate Medical Education
Introduction to the CLER National Report of Findings 2022: The COVID-19 Pandemic and Its Impact on the Clinical Learning Environment.
- Research Article
- 10.1097/acm.0000000000001577
- May 1, 2017
- Academic Medicine
In 1999, an Institute of Medicine report spurred health care organizations to implement systems-based quality improvement efforts and tackle patient safety. Simultaneously, the Accreditation Council for Graduate Medical Education asked residency programs to address Practice-Based Learning and Systems-Based Practice competencies. Medical educators now advocate incorporation of these competencies in undergraduate medical education.

The authors examine the success of these efforts both from the health care delivery and systems perspective as well as from the perspective of educators as they aspire to engage medical students and residents in these domains. The authors argue that the missing element that prevents health care systems from the full realization of the promise of quality improvement is bidirectional alignment. Included are examples from the literature to demonstrate how medical educators are moving toward alignment of learners with health system quality improvement and safety needs. Finally, the authors explore business and information technology governance literature in support of the hypothesis that bidirectional alignment should be the next step in moving from reactive to proactive systems of care.
- News Article
- 10.4300/jgme-d-21-00794.1
- Oct 1, 2021
- Journal of Graduate Medical Education
It has been 5 years since the release of the first CLER National Report of Findings, which focused on the larger Sponsoring Institutions, encompassing data from 297 clinical sites. In contrast, this National Report presents findings from 566 clinical learning environments (CLEs) associated with Accreditation Council for Graduate Medical Education (ACGME) Sponsoring Institutions (both large and small).1 It provides a number of insights as to how hospitals, medical centers, ambulatory care sites, and other clinical settings serve as teaching environments for the approximately 145 000 resident and fellow physicians participating in more than 12 000 ACGME-accredited programs. This report has several unique features that will inform the ACGME, the graduate medical education (GME) community, and the public about these important environments where learning occurs in the context of providing patient care. First, both larger and smaller Sponsoring Institutions concurrently participated in CLER site visits in a single time period for the first time. Second, the report presents trends across several of the CLER Focus Areas for a subset of approximately 240 CLEs that have completed 3 CLER visits. Third, the report includes findings from institutions that achieved ACGME accreditation status through the Single Accreditation System and progressed past initial accreditation. Last, the report reflects findings of a new focus area called Well-Being.

The findings demonstrate that CLEs exhibit some common features with regard to the focus areas, irrespective of CLE bed size (ie, acute bed count), geographic location, or the type of ownership of the clinical site. The findings also suggest that there are some notable differences seen in the focus areas based on CLE characteristics.
For example, there were significant differences in the percentage of residents and fellows who reported (1) participating in a quality improvement project linked to one or more of the clinical site's quality improvement goals; (2) following a standardized process for handoffs between shifts that included a standardized written template for communication; and (3) that, based upon their experience at the clinical site, faculty members often or always disclose whether or not they have potential conflicts of interest during each of their clinical rotations. Over time the CLER Program will be seeking to both understand these differences and identify successful, albeit potentially different, approaches to optimizing the various CLEs across the range of ACGME-accredited Sponsoring Institutions.

As noted earlier, this report provides a first look at trends across 3 time periods for a subset of Sponsoring Institutions whose principal CLE participated in 3 successive CLER visits. The size and scope of this analysis are aided by a small set of questions that remained the same in all 3 cycles of CLER visits. Some interesting observations emerged from this view. Specifically, there has been demonstrable improvement in GME involvement in addressing patient safety. Patient safety has been a major focus of the CLER visits, and the attention to this important and critical area of health care is reflected in the signs of improvement. Overall, CLEs appear to have dramatically increased their attention to resident and fellow access to and use of patient safety event reporting systems. For example, in Cycle 1, approximately one-third of the CLEs indicated they tracked the number of patient safety event reports submitted by residents and fellows; in Cycle 3, the percentage had increased to 80%.
Similarly, there has been a nearly 20% relative increase in the percentage of residents and fellows who reported into their CLE's patient safety event reporting system.

This report also highlights areas in need of additional attention—specifically in engaging residents and fellows in patient safety event analysis, which has not improved across the 3 cycles. The lessons learned from the CLER Program's Pursuing Excellence Pathway Leaders Patient Safety Collaborative2 and the Program Directors' Patient Safety and Quality Educators Network (a collaborative effort of the ACGME, Project ECHO, and the Organization of Program Director Associations) provide an evidence base and new approaches to addressing this challenging and important finding.

The findings related to health care disparities, while more modest, indicate for the first time that CLEs are starting to recognize the importance of this issue. During the site visits, more CEOs and their executive teams were starting to have open discussions on the need to examine risks for health care disparities within the populations served in the CLE. It is important to remember that these findings reflect conversations held with the CLE executive leaders prior to the start of the COVID-19 pandemic. It will be informative to see how the commitment to eliminating disparities, and success in doing so, may improve in the next cycle of visits given the high degree of visibility of health care disparities revealed by the COVID-19 pandemic.

For several of the CLER Focus Areas, the report presents trends that show no change and, in some cases, trends in an undesirable direction. These findings will be the source of important reflection and possible intervention as they are further studied.

The report also identifies some new challenges.
In examining the approximately 50% of residents and fellows interviewed who reported encountering a physician (attending physician or consultant) who made them feel uncomfortable when requesting assistance, the report notes this was more prevalent among residents who were earlier in their postgraduate training. These findings indicate suboptimal educational experiences, inadequate implementation of appropriate supervision and mentoring, and challenges to the culture of safety and patient care.

Along similar lines, of the one-third of residents and fellows interviewed who reported that they would "power through" to hand off even if they were impaired by fatigue, most were in their early years of postgraduate training.

One of the key areas highlighted in this report is the new focus area of well-being. Well-being is 1 of the 4 elements of the quadruple aim and is integral to the ACGME's mission.3 In 2017, the ACGME joined with the National Academy of Medicine and other members of the health care and medical education community in ongoing efforts to address clinician well-being and resilience—specifically the challenges posed by the rapid changes in health care organizations and in patient needs.4 This report presents the first national data that characterize many aspects of well-being within the nation's CLEs. In gathering the data, the CLER Site Visitors focused on 4 priority areas as delineated in the CLER Pathways to Excellence, version 2.05—work/life balance, fatigue, burnout, and support of those at risk of or demonstrating self-harm.

While the report reveals a number of interesting findings on well-being, 2 issues are particularly noteworthy. The first is not formally reflected in the findings; rather, it relates to observations of the CLER site visitors. As part of the site visit protocol, the CLER team asked to meet with the individuals responsible for leading the CLE's efforts to address well-being.
The site visitors informally noted that a variety of individuals attended these meetings. They noted that the well-being representatives could often speak to isolated well-being activities for individuals or professions. However, absent from these conversations was a cohesive effort on the part of the CLE to implement a common strategy to address the well-being of the clinical care team. The other issue relates to the types of interventions that were being planned or implemented in the CLEs. The findings included examples of new efforts to identify individuals at risk, especially efforts to identify residents and fellows, and efforts to build resilience in the clinical care team. There were few examples of CLEs that were able to describe a strategy or substantive efforts to address the system-based factors that were creating challenges to well-being.

Like nearly everyone else during 2020–2021, the ACGME CLER team was prompted by the COVID-19 pandemic to reflect on opportunities to better understand the impact of this catastrophic societal challenge. In response, the CLER Program has launched a specially designed site visit to understand the pandemic's impact on the CLEs.6

Looking forward, the CLER Program is seeking to use the knowledge from this and prior reports to inform a transformation in thinking about how best to assess, understand, and inspire the CLEs of ACGME Sponsoring Institutions. This will include efforts to advance the CLER site visit program's configuration to build on the experiences of the CLER-COVID protocol, possibly retaining some of the new features such as a sampling approach to selecting Sponsoring Institutions and CLEs visited, maintaining the new model of additional advanced notification for scheduling, incorporating new surveys for executive leadership, and possibly integrating some component of remote interviews.
Any changes to the site visit protocol will still need to retain in-person visits to facilitate walking rounds. The impromptu conversations that happen on the walking rounds of the floors and service areas of the CLEs provide essential perspectives and insights from other members of the clinical care team (eg, nurses, pharmacists, social workers, respiratory technicians) and additional members of the GME community who do not participate in the formal group meetings.

As modeled in the CLER Program's Pursuing Excellence initiative, the CLER Program will continue to seek opportunities to use the information gained from this report and future visits to provide the GME and CLE leadership with the knowledge and tools to advance improvements in their CLEs. The aggregate information in this and past CLER National Reports will also serve as an evidence base to inform the upcoming major revision of the ACGME Institutional Requirements.

Of final note, the CLER Program will also be undertaking efforts to transform its body of work to increasingly focus on an outcomes-oriented perspective of CLE performance. There are currently a number of efforts nationally to assess the quality of care provided in the many types of CLEs that host GME. The National Academy of Medicine recognizes attention to health care outcomes to be a major priority if the United States is to achieve the goal of better health at lower cost.7 Through the CLER Program, the ACGME will be exploring the resources available to better align and integrate GME performance with this national direction of measuring health care quality outcomes.
While this effort is in its early stages, over time it will become an essential tool to aid the ACGME's ability to provide the best possible formative feedback to the GME community and their CLEs to optimize learning and patient care in the framework of the quadruple aim.

This report would not have been possible without the support of the ACGME Board of Directors, oversight by the CLER Evaluation Committee, the efforts of the CLER program administration and field representatives, and, most importantly, the engagement of the entire GME community and the leadership of the hospitals, health systems, and other clinical sites that serve as US clinical learning environments seeking to provide the best patient care.
- News Article
- 10.4300/jgme-d-21-00210.1
- Apr 2, 2021
- Journal of Graduate Medical Education
Since its inception, the Accreditation Council for Graduate Medical Education (ACGME) Clinical Learning Environment Review (CLER) Program has sought to create a conversation about how the hospitals, health systems, and other clinical care settings that host ACGME-accredited residency and fellowship programs serve as clinical learning environments (CLEs) for our nation's resident and fellow physicians.1 Over the past 5 years, the CLER national reports have provided the leaders of graduate medical education (GME) and the executive leaders of CLEs with new information aimed at optimizing learning and patient care.2–4

From the beginning, the CLER Program has experienced challenges in comprehensively including the operative and procedural areas as part of the site visit protocol. The CLER Program recognized the importance of understanding these key clinical areas—both the implications for patient safety and health care quality5,6 and the implications for how residents and fellows learn in these environments. In its third cycle of visits, the CLER Program implemented a subprotocol in parallel with the regular visit to a sample of 25 of the larger Sponsoring Institutions with ACGME-accredited programs in surgical and anesthesia specialties. The subprotocol specifically addressed the challenges that made it impractical to include the operative and procedural rooms in the regular CLER visit. The main protocol and associated subprotocol explored the 6 focus areas of patient safety, health care quality (including health care disparities), care transitions, supervision, well-being, and professionalism.

The teams for these augmented visits were enhanced with 2 to 4 additional CLER Field Representatives with backgrounds in surgery or anesthesiology. The team members responsible for the subprotocol joined the other members of the CLER site visit team for the initial and exit meetings with executive leadership and for the meeting with the leaders in patient safety and quality.
Aside from these meetings, they focused exclusively on the operative and procedural areas of the clinical site. The subprotocol included scheduled meetings with physician and nursing leaders in surgical and procedural areas as well as meetings with operating room nurses. However, the majority of the subprotocol team members' time was spent on walking rounds, observing the preoperative, operative, and postoperative care units and talking with various members of surgical and procedural teams.

The CLER Program released the first report of findings from the subprotocol in March 2021.7 This report provides an important look at these unique CLEs. As with the larger CLER national reports, the key findings of the subprotocol highlight a mixture of strengths and opportunities for improvement—some unique to the perioperative environment and some similar to those found elsewhere in the CLE. Dr Thomas Nasca, President and Chief Executive Officer of the ACGME, notes in his introduction to the report that the findings are important in that they reveal unexpected attributes of the learning environment that may spur new thinking about opportunities to improve the operative and procedural experiences for residents and fellows. The report highlights a set of these findings as possible opportunities for future conversations.

In addition to these selected findings, the report also includes a rich set of additional findings and related discussions authored by volunteer members of the CLER Evaluation Committee and a National Advisory Group to the subprotocol. These sections encourage the leaders of hospitals, medical centers, and other clinical settings that have residents and fellows in the operative and procedural areas to think differently about how GME provides new opportunities to improve patient safety and health care quality in these complex and critical areas of patient care.
Importantly, the findings and discussions encourage CLEs to cultivate future leaders within the surgical and procedural specialties who are committed to systems-based approaches to optimizing patient care.