Abstract

Merriam-Webster defines quality as “a degree of excellence.”1 What is left unstated is how degree and excellence are defined. Does this suggest that the quality of medical education research, like beauty, lies in “the eye of the beholder”? Can we measure quality objectively and consistently, or is it subjective and contextual, varying with the type of research question, reviewers' judgments, or the quality indices applied? Do these factors capture the aspects of quality that you, our readers, value? We pose these questions for your consideration as you read the following review papers published in this issue of the Journal of Graduate Medical Education (JGME).

Locke and colleagues2 reviewed graduate medical education (GME) research papers published in 2011 and selected the 12 articles they considered of greatest importance to internal medicine teachers. With a similar target audience, Eaton et al3 used the Medical Education Research Study Quality Index (MERSQI)4,5 to score quantitative internal medicine residency research papers over a 2-year period; the authors then reviewed the papers ranking in the top 25th percentile for common themes. Examining the surgical education literature published over a decade, Wohlauer and colleagues6 identified common themes and research methods by reviewing the articles most frequently cited in Web of Science, using citation frequency as a surrogate for relevance and quality. Each review aims to identify notable medical education papers for a specific audience and time period, but each takes a different approach.

Despite overlapping themes (common topics were simulation, duty hours, resident well-being or distress, resident assessment, and career choices), these 3 reviews produced different results. Of note, the reviews by Locke et al2 and Eaton et al3 had comparable target audiences, search techniques, and journals reviewed, yet they identified only 2 common papers. The differences may be explained by the use of dissimilar quality criteria, the exclusion of qualitative papers from 1 review, and only a 50% overlap in review periods. However, the finding that 2 selection processes with a similar aim produced almost mutually exclusive results remains striking.

The lack of a common definition of quality for medical education research does not stem from a lack of prior efforts to both define and improve the quality of our studies. In addition to the MERSQI, other instruments exist to measure quality in quantitative studies, such as the Best Evidence in Medical Education Global Scale and the Modified Newcastle-Ottawa Scale.7,8 These instruments vary in (1) their incorporation of items that address methodological rigor, (2) their reliance on outcome quality based on Kirkpatrick's hierarchy of outcomes of educational interventions, and (3) their association with quality as assessed in a systematic review of method and reporting quality in education research.9–11 Although methodological rigor is the foundation of quality, attempts to boost quality by focusing on rigor at the expense of other aspects of quality can sometimes diminish the value of the results for consumers.
Even the emphasis on outcomes research, a well-intentioned effort to encourage studies that address the highest-tier outcomes (patient care or physician behavior), may result in the unintended consequences of dilution, diminished feasibility, failure to establish a causal link, biased outcome selection, and “teaching to the test.”12 In addition, we recognize that consumers of education research may place value on factors that are not captured by available instruments and that may be neglected by a myopic focus on only the pinnacle of Kirkpatrick's pyramid. The definition of quality for a given product is usually informed by the consumers of that product. Readers of JGME may value elements of quality that are not currently captured by available instruments or methods; we seek your input to guide future efforts to identify notable medical education papers and to help redefine quality in our research.
