Abstract

This section includes detailed findings from the second set of visits (2015–2017) of the Clinical Learning Environment Review (CLER) Program. The findings in the 6 CLER Focus Areas1 are based on site visits to the major participating clinical sites (ie, hospitals and medical centers) for 287 Accreditation Council for Graduate Medical Education (ACGME)-accredited Sponsoring Institutions (SIs) with 3 or more core residency programs.2,3 These clinical sites serve as clinical learning environments (CLEs) for the SIs.

Collectively, the 287 SIs oversee 9167 ACGME-accredited residency and fellowship programs, with a median of 20 programs per SI. These larger SIs account for 87.1% of all residents and fellows in ACGME-accredited programs—with a range of 17 to 2156 trainees per SI (median = 246).

Approximately 28% of the CLEs were located in the Northeast region of the United States, 30.3% in the South, 26.5% in the Midwest, and 14.6% in the West. The sites ranged in size from 107 to 2654 acute care beds (median = 528). The majority (67.2%) were nongovernment, not-for-profit organizations; 23.3% were government, nonfederal; 5.9% were investor-owned, for-profit; and 3.5% were government, federal. Although the CLER teams spent the majority of their time at inpatient settings, they also sometimes visited affiliated ambulatory care practices in close proximity.

In total, the CLER teams interviewed more than 1600 members of executive leadership (including chief executive officers), 9262 residents and fellows, 8164 core faculty members, and 6034 program directors of ACGME-accredited programs in group meetings.
Additionally, the CLER teams interviewed the CLEs' leadership in patient safety and health care quality and thousands of residents, fellows, faculty members, nurses, pharmacists, social workers, and other health care professionals while on walking rounds in the clinical areas.

As previously described in the CLER National Report of Findings 2016,4 these findings are based on a mixed methods approach to data gathering and analysis, which improves the accuracy of the findings by combining quantitative, descriptive, and qualitative evidence in a complementary manner. As such, some of the findings are represented quantitatively while others are described qualitatively.

The combination of methodologies and varied representation of findings should be considered when interpreting the results, making comparisons, or drawing conclusions. Both supporting and conflicting evidence may be presented to explain or qualify findings. For example, results from the group interviews may appear more positive than information gathered on walking rounds. Alternatively, practices reported during group interviews may have been verified on walking rounds.

During the group interviews with residents and fellows, faculty members, and program directors, an electronic audience response system (ARS; Keypoint Interactive version 2.6.6, Innovision Inc, Commerce Township, MI) was used to collect anonymous responses to closed-ended questions. The results from the ARS were analyzed at both the individual (eg, residents and fellows) and the CLE levels.

At the individual level of analysis, results are presented as percentages of the total number of individuals surveyed. At the CLE level of analysis, individual responses were aggregated at the CLE level and results are presented as median and interquartile range (IQR) percentages.
Statistically significant differences (ie, P ≤ .05) in responses due to resident and fellow characteristics (eg, residency year) and CLE characteristics (eg, bed size) are also reported. Of note, statistical significance does not always imply practical significance. For example, differences in responses by residency year may be statistically significant, but the differences may not be meaningful or large enough to have practical relevance or implications.

As described in the Methodology section,5 this report contains a specific set of descriptive terms that summarize quantitative results from both the ARS and specific findings that were quantified from the site visit reports; each term corresponds to a defined quantitative range.

Besides the quantitative data, this report contains qualitative data from a number of open-ended questions that CLER Site Visitors asked during group interviews and walking rounds. This information, by design, was not intended to be enumerated. For these questions, the site visit teams made an assessment of the relative magnitude of observations at each individual site. To prevent confusion, these results are presented in the report using a set of descriptive terms different from the previously described terms used for quantitative data; the qualitative descriptive terms are intended to approximate the quantitative terms.

Finally, this section follows approximately the same structure as the individual CLER Site Visit reports received by participating institutions. This structure is intended to facilitate easy comparison between data from an individual site and that of this report, which aggregates results from all 287 SIs. Those who seek additional detail may consult the Appendices (p. 81–124).
Appendix A contains additional information on the SIs, sites visited, and groups interviewed; Appendix B contains selected aggregated quantitative results from the group interviews with residents and fellows; and Appendix C contains qualitative information from the group interviews and walking rounds.

The CLER Program explored several aspects of resident and fellow engagement in patient safety with emphasis on 5 major topics: culture of safety, use of the patient safety event reporting system, knowledge of patient safety principles and methods, inclusion in patient safety event investigations, and disclosure of patient safety events. Generally across CLEs, members of the executive leadership team identified patient safety as their highest priority area for improvement.

The patient safety and quality leaders in many CLEs indicated that they periodically conduct a culture of safety survey that includes residents, fellows, and faculty members. Overall, 97.7% of the residents and fellows in the group interviews reported that their CLE provides a safe and nonpunitive environment for reporting errors.

Across CLEs, physicians and other staff members also reported use of the patient safety event reporting system to report on individual behaviors.
This use included reporting on behaviors in a retaliatory fashion or in a manner that could be perceived as punitive. Given this and based on the collective findings from the site visits, it is unclear whether residents, fellows, and other staff members perceived a safe and nonpunitive culture for reporting patient safety events.

Overall, CLEs had 1 or more mechanisms for reporting patient safety events, including an online or paper-based patient safety event reporting system, a chain-of-command system that allowed events to be reported to an immediate supervisor (eg, a more senior resident or faculty member), and a mechanism to verbally report events to the patient safety staff (eg, a hotline).

In general, residents and fellows appeared to be aware of their CLE's process for reporting patient safety events such as adverse events, near misses/close calls, and unsafe conditions. During walking rounds, the CLER Site Visit teams also asked nurses about their CLE's patient safety event reporting system. Across nearly all CLEs (97.2%), nurses appeared to be familiar with their CLE's system for reporting patient safety events.

Approximately 78% of CLEs were able to provide information on the number of patient safety event reports submitted by residents and fellows (see Appendix C1), and 70.7% were able to provide the number of patient safety event reports submitted by attending physicians. The remaining CLEs indicated that their system did not track such information.
Whereas CLEs occasionally provided the Graduate Medical Education Committee and their governing body with information on the number or percentage of patient safety event reports submitted by residents and fellows, it was less common for them to routinely report the number or percentage of patient safety event reports submitted by faculty members to these same groups.

Generally across CLEs, the residents and fellows interviewed on walking rounds appeared to lack understanding and awareness of the range of reportable patient safety events, including what defines a near miss/close call. In most CLEs (83.6%), nurses' understanding of reportable patient safety events also varied (see Appendix C2).

Across CLEs, residents, fellows, and nurses appeared to focus on reporting sentinel events, medication errors, patient falls, and other events with harm; they did not appear to recognize near misses/close calls, unsafe conditions, events without harm, unexpected deteriorations, or known procedural complications as reportable patient safety events. Residents, fellows, and nurses appeared to have little awareness of the importance of reporting these events and how such reporting can provide valuable information for identifying system failures, addressing vulnerabilities in the system, reducing risks, and improving patient safety.

Overall, 72.7% of the residents and fellows in the group interviews indicated that they had experienced an adverse event, near miss/close call, or unsafe condition while at their CLE. This experience varied by gender, year of training, and specialty grouping (see Appendix B1).

Of the residents and fellows who reported that they had experienced an adverse event, near miss/close call, or unsafe condition, 49.8% indicated that they had personally reported the patient safety event using the CLE's patient safety event reporting system. Responses varied by gender, year of training, and specialty grouping.
Across CLEs, the median (IQR) finding was 50.0% (37.5%–66.7%) and varied by region, CLE bed size, and type of ownership (see Appendix B2). Of those who did not personally enter the patient safety event into the system, 13.6% indicated that they relied on a nurse to submit the patient safety event report, 24.4% indicated that they relied on a physician supervisor, and 12.1% indicated that they cared for the patient and chose not to submit a report.

When faculty members and program directors in the group interviews were asked what process residents and fellows most frequently followed when reporting a patient safety event, 57.9% of the faculty members and 53.7% of the program directors indicated that they believed residents and fellows most often reported the event themselves using the CLE's patient safety event reporting system.

In a separate query, 23.6% of the residents and fellows in the group interviews indicated that they had reported a near miss/close call event while at the CLE; responses varied by gender, year of training, and specialty grouping (FIGURE 1). Across CLEs, this finding ranged from 0% to 100%, with a median (IQR) of 23.1% (15.2%–33.3%); responses varied by region and type of ownership (see Appendix B3).

On walking rounds, residents and fellows in many CLEs mentioned that they often report patient safety events locally or through their chain of command while also indicating familiarity with the patient safety event reporting system and its use. When they delegated or relied on others to report, it was unclear if these reports were formally captured in the CLE's centralized patient safety event reporting system. Residents and fellows mentioned the cumbersome process of submitting a report, the time needed to enter a report, fears of repercussion, and uncertainty about receiving feedback as reasons for not reporting.
The collective information from the site visits indicated that in 70.6% of the CLEs, resident and fellow reporting of patient safety events into the CLE's patient safety event reporting system was varied or infrequent (see Appendix C3).

In the group interviews, the CLER teams also explored faculty members' and program directors' use of the CLE's patient safety event reporting system. Approximately 36% of the faculty members reported that they had personally reported an adverse event, near miss/close call, or unsafe condition in the past year (median [IQR], 35.7% [26.0%–46.6%] across CLEs). Among the program directors, 35.9% reported that they had personally reported an adverse event, near miss/close call, or unsafe condition in the past year (5.5% had no clinical responsibilities at the site). Across CLEs, the median (IQR) finding was 36.0% (27.3%–50.0%). In both groups, responses varied by CLE bed size and type of ownership.

In the group interviews, the CLER teams asked residents and fellows whether they received feedback on patient safety event reports. Of those who had experienced an adverse event, near miss/close call, or unsafe condition and who had personally submitted a patient safety event report or relied on a nurse or supervisor to submit the report, 46.1% reported that they received feedback on the outcome of the report. Responses varied by gender, specialty grouping, and year of training (FIGURE 2; see also Appendix B4).

Residents and fellows often mentioned receiving an e-mail acknowledging receipt of the patient safety event report. They also noted receiving requests for additional information as part of a formal patient safety event investigation. It was uncommon for residents to mention receiving information on the outcome of the investigation, including recommended actions to address vulnerabilities in the system and to improve patient safety.
Across CLEs, residents, fellows, nurses, and other clinical staff expressed a strong desire to receive feedback in response to submitting a patient safety event report.

Overall, CLEs varied in their processes for reviewing and prioritizing patient safety events. Residents and fellows also varied in their knowledge of these processes and often used the term “black box,” indicating that these processes were unclear. Many residents and fellows appeared to be unaware of how their CLEs use the reporting of adverse events, near misses/close calls, or unsafe conditions to improve care both broadly and at the individual departmental level. Residents and fellows were rarely involved in their CLE's process for reviewing and prioritizing patient safety events that required further investigation.

On walking rounds, the CLER teams explored resident and fellow participation in the time-out process as part of patient safety practices (eg, ambulatory and bedside procedures). Across many CLEs, residents, fellows, nurses, and other health care professionals interviewed on walking rounds indicated that residents and fellows do not consistently conduct standardized time-outs before performing bedside procedures.

Across most CLEs (91.6%), residents and fellows appeared to have limited knowledge of fundamental patient safety principles and methods (eg, Swiss cheese model of system failure, root cause analysis, fishbone diagrams; see Appendix C4).

When asked to identify their skills in applying patient safety principles, the majority of the faculty members indicated that they were either proficient or expert (62.7% and 25.1%, respectively) in applying these skills. Similarly, most of the program directors reported themselves as proficient or expert (63.6% and 21.9%, respectively).

Of the residents and fellows in the group interviews, 36.3% reported that they had participated in a structured interprofessional simulation activity related to patient safety.
Responses varied by gender, year of training, and specialty grouping. Across CLEs, the median (IQR) finding was 37.1% (26.3%–50.0%), with responses varying by region and type of ownership.

In many CLEs, the patient safety and quality leaders indicated that they did not track resident and fellow participation in patient safety event investigations (eg, root cause analysis). A limited number of CLEs provided the Graduate Medical Education Committee and the governing body with information regarding the number of residents and fellows who had participated in formal patient safety event investigations.

The CLER teams also asked the program directors in the group interviews if they measured resident and fellow participation in patient safety event investigations. Approximately 42% of the program directors reported tracking resident and fellow involvement (median [IQR], 44.4% [30.0%–66.7%] across CLEs). Responses varied by region, CLE bed size, and type of ownership.

In the group interviews, 37.6% of the residents and fellows who were postgraduate year 3 (PGY-3) and higher indicated that they had participated in an interprofessional investigation of a patient safety event that included components such as analysis of system issues, development and implementation of an action plan, and monitoring for continuous improvement. Responses varied by specialty grouping (FIGURE 3). Across CLEs, the median (IQR) finding was 37.6% (28.6%–50.0%), with responses varying by region, CLE bed size, and type of ownership (see Appendix B5).

The CLER teams also asked faculty members about their involvement in interprofessional patient safety event investigations.
Approximately 64% of the faculty members in the group interviews reported that they had participated in an investigation of a patient safety event that involved physicians, nurses, administrators, and other health care professionals (median [IQR], 63.3% [53.0%–73.2%] across CLEs).

Overall, the format and process of investigating patient safety events varied both across and within CLEs. It was uncommon for residents and fellows to describe involvement in comprehensive systems-based approaches to patient safety event investigations aimed at preventing future adverse events and sustaining improvements in patient safety. In general, residents and fellows described experiences that lacked the attributes of a formal patient safety event investigation, with very little or no interprofessional or interdisciplinary engagement. Residents and fellows varied widely in their perceptions of what constituted a formal investigation of a patient safety event. Across many CLEs, case conferences, morbidity and mortality conferences, and grand rounds continued to be the major approach to patient safety event investigations.

Faculty members and program directors indicated that departmental mortality conferences, case conferences, and online modules were other informal approaches to model elements of a patient safety event investigation.

In the group interviews, 66.0% of the residents and fellows indicated that they had received training on disclosing medical errors to patients and/or families (4.5% reported that such training was not applicable). Responses varied by year of training. Across CLEs, the median (IQR) finding was 68.2% (57.1%–79.3%), with responses varying by region and CLE bed size.
Of those who received training, 10.1% indicated that the training was primarily simulation based; 69.8%, didactic and/or online; 15.1%, informal; and 5.0%, other.

Approximately 82% of the residents and fellows in the group interviews indicated that they knew of CLE resources to assist them in coping with a major patient safety event that resulted in a patient death (median [IQR], 85.8% [74.7%–93.0%] across CLEs; see Appendix B6 for information on variability). Of those familiar with the resources, most indicated that they would be somewhat comfortable (39.8%) or very comfortable (44.7%) in using these resources.

The CLER Program explored resident and fellow engagement in improving health care quality within the context of 6 major areas: involvement in developing and implementing the CLE's strategies for health care quality, awareness of the CLE's health care quality priorities, knowledge of health care quality terminology and methods, engagement in quality improvement (QI) projects, access to quality metrics data, and engagement in CLE efforts to address health care disparities.

As part of understanding the CLE's approach to improving health care quality, the CLER Site Visit teams reviewed the organization's strategic plan for quality and interviewed both executive and patient safety and quality leaders. Overall, a limited number of CLEs appeared to integrate QI within the organization as part of a system-wide, comprehensive approach to promote experiential learning and to improve quality and safety across the organization.

Across CLEs, resident and fellow involvement in strategic planning for QI was uncommon. Residents and fellows often served as implementers of CLE-wide QI activities (eg, hand hygiene, reducing hospital-acquired infections, reducing 30-day readmissions).

A limited number of CLEs had instituted resident and fellow committees aimed at increasing resident and fellow engagement in QI; few of these committees were integrated into the CLE's formal QI processes.
In many CLEs, resident and fellow participation in institutional QI committees was uncommon; often, roles and expectations for participation were undefined or unclear. The clinical sites also appeared to have insufficient structure to allow residents and fellows to attend committee meetings regularly and to participate in meaningful ways. Additionally, residents and fellows in many CLEs were not included in the governing body's patient safety and quality committees.

In general, priorities for improving health care quality varied across CLEs. However, some common themes included alignment with broad national priorities such as Centers for Medicare & Medicaid Services value-based purchasing, Core Measures, or publicly reported performance measures. Many CLEs were also highly focused on meeting specific criteria such as reducing 30-day readmissions or improving performance on metrics related to pneumonia, chronic heart failure, and surgical care improvement project measures.

In the group interviews, 78.8% of the residents and fellows (PGY-2 and above) reported knowing their CLE's priorities for improving health care quality (see Appendix B7 for additional information on variability). When asked the same question, 84.4% of the faculty members and 86.7% of the program directors reported knowing the priorities. Often, the physician groups focused on departmental activities and did not describe priorities that aligned with those identified by the CLE's executive leadership or the patient safety and quality leaders.
When the physicians identified priorities aligned with those of executive leadership, they were most commonly around nationally recognized measures, especially those related to programs with financial incentives such as measures from the Centers for Medicare & Medicaid Services.

In 55.1% of the CLEs, the residents and fellows appeared to have limited knowledge or understanding of basic QI terminology and methods such as Lean, Plan-Do-Study-Act, and Six Sigma (FIGURE 4; see also Appendix C5). A limited number of residents and fellows could articulate the QI approach employed by their CLE in designing and implementing QI activities to improve patient care.

In general, the approach to educating residents and fellows about health care QI varied both within and between CLEs. Although some type of education was common as part of new resident and fellow orientation, a limited number of CLEs aimed to provide ongoing training for all residents and fellows. Training in health care QI appeared to occur primarily within departments or graduate medical education (GME) programs, and the format, methods, and content appeared to vary widely.

In 25.3% of the CLEs, the patient safety and quality leaders indicated that they monitor resident and fellow QI projects.

In the group interviews with residents and fellows (PGY-2 and above), 78.3% reported they had participated in a QI project of their own design or one designed by their program or department. Of this group, 48.2% reported that their QI project was directly linked to 1 or more of the CLE's goals; 23.3% were uncertain. Of those who reported their QI projects were linked to the CLE's goals, 74.3% reported their projects involved interprofessional teams. Appendices B8, B9, and B10 provide detailed information on variability.

In the group interviews and on walking rounds, the CLER teams asked residents and fellows to describe their QI projects. Overall, residents and fellows varied in their descriptions of these projects.
It was uncommon for residents and fellows to describe projects that aligned with their CLE's priorities. In most CLEs (82.2%), few described projects that included the components of a complete QI cycle (ie, Plan-Do-Study-Act) (FIGURE 5; see also Appendix C6). Often, resident and fellow participation was limited to planning and implementing a QI activity. For many residents and fellows, their QI projects did not involve formally assessing effectiveness and designing follow-up actions to adjust, support, and sustain ongoing QI efforts.

It was also uncommon for residents and fellows to describe involvement in interprofessional team-based QI projects. During the interviews on walking rounds, a limited number of nurses and other health care professionals indicated that they were involved in interprofessional QI projects that included residents and fellows.

When the CLER teams queried faculty members in the group interviews about their engagement in interprofessional QI projects, 72.7% reported that they had participated in a QI project with nurses, pharmacists, and other members of the health care team (median [IQR], 75.0% [65.0%–83.3%] across CLEs).

In the group interviews, 74.8% of the program directors reported that their residents and fellows have ready access to organized systems for collecting and analyzing data for the purposes of QI. Electronic health records, specialty-specific clinical registries, and local, regional, or national quality dashboards were often reported as common sources of QI data. Residents and fellows often mentioned the challenges (eg, long waiting lists) in acquiring specific reports from these data sources. Many faculty members noted that residents and fellows had limited support for data analysis. When support existed, it was often a departmental resource.
The type and extent of analytic support services available to residents and fellows varied both within and across CLEs.

Overall, 30.9% of the residents and fellows in the group interviews reported receiving aggregated or benchmarked QI data on their own patients. Responses varied by gender, year of training, and specialty grouping. Across CLEs, the median (IQR) finding was 31.3% (22.2%–42.2%), with responses varying by region, CLE bed size, and type of ownership (FIGURE 6; see also Appendix B11).

Occasionally, the patient safety and quality leaders indicated that residents and fellows receive QI data to compare the care of their own patients with that of others served by their clinical site.

Across many CLEs, executive leaders were aware of issues of health disparities affecting their surrounding communities. Many described conducting community needs assessments to improve access to care and providing free or low-cost care and clinics for the underserved, often staffed by residents and fellows from a few core specialties (eg, family medicine, internal medicine, pediatrics, obstetrics and gynecology). A limited number of residents and fellows from other specialty and subspecialty programs reported engaging in these activities.

A limited number of executive leaders spoke to health care disparities occurring within their hospital or medical center. Overall, less than 5% of executive leaders described a specific set of strategies or a systematic approach to identifying, addressing, and continuously assessing variability in the care provided to, or the clinical outcomes of, their patient populations at risk for health care disparities.
In approximately half of the CLEs, the executive leaders, faculty members, or program directors indicated that some departments were collecting data or conducting studies related to health care disparities among specific patient populations; many of these efforts were reported as research projects.

In the group interviews, 55.1% of the residents and fellows reported that they knew their CLE's priorities in addressing disparities in health care; responses varied by year of training and specialty grouping. Across CLEs, this finding ranged from 7.1% to 100% (median [IQR], 59.4% [43.2%–78.0%]). Responses varied by region, CLE bed size, and type of ownership (see Appendix B12). In comparison, 66.3% of the faculty members and 68.1% of the program directors reported that they knew their CLE's priorities with regard to health care disparities.

Overall, residents, fellows, faculty members, and program directors in the group interviews were able to describe populations at risk for health care disparities at their clinical site.

In the group interviews, 33.6% of the residents and fellows reported that they had received cultural competency training that was specific to populations at risk for health care disparities at their clinical site; 37.0% reported receiving training that was not specific to the CLE's patient population; 24.0% reported receiving training that was primarily informal while providing clinical care; and 5.4% indicated that they had not received cultural competency training at their CLE.

Across CLEs, a median (IQR) of 32.8% (23.3%–46.4%) of the residents and fellows indicated that they had received cultural competency training that was specific to populations at risk for health care disparities at their clinical site.
Responses varied by region, CLE bed size, and type of ownership (FIGURE 7; see also Appendix B13).

During interviews on walking rounds, many residents and fellows described education and training in cultural competency that was largely generic and not specific to the di
