Abstract

Synthesis of diverse content into a coherent whole is one of the most difficult skills to learn and teach. We start with raw material, whether research data, published reports, clinical observations, or the like. Somehow we must make sense of the mix of information and communicate its meanings in a concise and coherent way to inform future action. How is synthesis done, and how can we teach it? To approach these questions, I start by examining synthesis, move to the kinds of information that must be synthesized in a literature review, and then consider how the skills of synthesis might be fostered.

Those of us who teach research are familiar with the laundry-list format of many novice literature reviews. They might read, “Smith and Jones studied this in this sample and had these results, Brown and Guerrero studied something else in a different sample and found something else,” and so forth. Such a list lacks evidence of a critical appraisal of each study's rigor on its own, a summary of the methods used and what was found across studies in logically categorized subsets of research, or a nuanced appraisal of what is known about the larger question at hand, in the context of the methods that have been used to achieve that knowledge. A strong literature synthesis is a concise response to the question, “What do we know about this problem (its seriousness, correlates, consequences) and its solutions (interventions tested to date), and how (by what methods and measures) do we know it?”

Skilled clinicians use synthesis unthinkingly every day, and they acquire the skill only over time and in context. Think of Benner's landmark work, From Novice to Expert: Excellence and Power in Clinical Nursing Practice (1984).
Drawing on her analysis of intensive observations and narrative accounts of nurses in clinical settings, Benner mapped out stages of skill development from novice through advanced beginner, competent, proficient, to expert, illustrating the essential role of immersion in a clinical world over time to enable nurses to refine received facts into fluid, ready-to-hand, embodied know-how. Benner observed that novices struggled to make sense of a sea of apparently unrelated clinical cues using difficult-to-apply classroom maxims. Advanced beginners began to ignore some cues, for better or worse, in order to apply rules and protocols to the findings considered salient. As nurses progressed, they appraised and responded to perceptual cues in clinical situations with increasing efficiency and discernment, by sifting through ever-growing internalized libraries of theory, clinical patterns, and variations they had seen in a range of contexts. The expert nurse fluidly identified and judged the seriousness of the central problem in its context, thoughtfully setting aside less relevant facts and signs while acting on urgent priorities. This is synthesis at a very high level.

A similar progression toward synthesis, this time in public education, was sketched in a recent op-ed column by Brooks (2015). He was responding to a documentary calling for education in a high-tech era to provide life skills practice such as collaboration and project work, rather than outdated textbook and lecture-based learning, given that computers now can store and retrieve facts so effectively. Brooks acknowledged the value of relational skills but countered that an educated citizen does need to be taught how to learn facts and work with those facts over time.
High school graduates need wisdom that starts with acquisition of facts, moving with the help of teachers and peers to pattern formation, in which facts are meaningfully linked, and over time achieving a mental reformation, in which one's thinking becomes adapted to the ways of the field, and one develops a new way of seeing new facts and posing new hypotheses based on now-familiar patterns. At that point, knowledge has been achieved. Over time, as the knowledge is tested and shaped in myriad situations, an intuitive awareness, vision, and freedom to think outside the box emerges, which Brooks termed wisdom. Without enabling students to absorb and gain ownership of patterns of facts as well as life skills for a high-tech age, Brooks argued, education will not serve society.

Benner and Brooks shed light on what needs to happen in learning to do a research literature review. First, facts must be learned, then versions of those facts must be recognized in practice over time, and then patterns must be discerned, characterized, and eventually tailored to new situations. The strength of evidence in past research on a given research question cannot be known without a consideration of both the methods and the conceptual foci of that body of studies. A new graduate student is probably a novice both in appraising the rigor of research and in identifying the concepts relevant to a clinical research question. These are two very large areas of learning. When we ask students new to research to do literature reviews in their first semester of study, as is common for PhD students, many are unprepared for either task.

Research rigor is the strength of a study's design and methods to answer its research questions with certainty. The degree of rigor with which a study was conducted is displayed in the methods section of a research report. How do seasoned scientists appraise research rigor?
They use cognitive skills similar to those used by clinical experts in assessing a familiar type of patient. While reading a report, seasoned scientists start by recognizing salient facts, then seeing patterns of consistent or inconsistent facts, then judging the relevance of these patterns in the context of a given study and eventually across the wider context of this topical area, finally grasping the strength of evidence as a whole and what more needs to be known.

If nursing editors can be considered experts in reading nursing science, I offer my own process as an example. Considering sampling adequacy, for example, when I read the research questions to be answered in a research report, I am already thinking of the nature of the sample that will be needed to answer them. I unthinkingly apply years of experience to quickly locate information about how sampling was done and the sample it produced. That information leads me to raise further questions about the sample's adequacy that I try to answer as I go through the rest of the report. I am left with a sense of satisfaction or of concern about how well that sample served to meet the research purpose at hand, in the context of similar reports in the field, along with a list of questions for the authors to further flesh out my mental picture of the study.

In contrast, when I explore a report from a very different discipline, which I have the opportunity to do as the graduate dean of my university, I am sent back to the novice end of the continuum. In reading a math or philosophy dissertation, I first need to find and recognize the salient facts that I should attend to, and then I need to grasp what constitutes a meaningful pattern of facts, how these patterns were discovered and tested in the methods of the discipline, and whether the author considered the present pattern common or uncommon and important or unimportant in the context of the larger field.
I am dependent on the author and my general life experience to appraise the overall merit of the work being described. This is much like what a novice research student encounters in reading an article for a first literature review.

How do students learn to appraise rigor in research reports, so as to be able to synthesize across studies? Returning to sampling, students may learn the factual definitions of random sampling, quota sampling, stratified random sampling, power, and power analysis, but they may not be adept in recognizing this information in a typical research article, let alone judging its adequacy in that context. Too often, these and other research design principles are taught in isolation from real-world published examples, although students may be assigned to do an in-depth critique of a single research article. This isolation of theory from practice is similar to the single-patient assignment of a new nursing student. He or she is struggling to recognize important clinical signs and apply principles learned in class to the care of this patient, but without ever having seen other patients with this problem or its co-morbidities. As Benner (1984) depicted, the novice does not even know in which direction to look when confronted with many equally compelling clues.

Students need progressive learning about research design and methods through guided immersion, much as Benner and Brooks described. Both would recommend plenty of exposure to a variety of articles of different kinds, to move students from puzzling through an undifferentiated sea of facts to grasping meaningful patterns, salient exceptions, and gaps in the evidence. To help students begin to appraise sampling and samples, for example, a hypothetical assignment would be to find 10 research reports on a topic of interest and highlight all the content related to sample selection and description.
Finding the relevant information may not be straightforward because it may be strewn through the methods and results sections or beyond. Then, the next assignment would be to apply principles learned in class and judge the relative adequacy of each sample for the work described in that report.

Then it is time for synthesis: moving from facts to knowledge. Having spent weeks with this set of articles, students can be asked to summarize the types of samples and sampling approaches in this collection as a whole, with an emphasis on the plurals, types and approaches, because there is likely more than one discernible pattern. Tables of information can be useful for seeing clusters or patterns across the studies under consideration, and such tables are common in published reviews of the literature. Distilling down the information in each cell is a cognitive exercise that strengthens the ability to distinguish the important from the unimportant. Intensive feedback is invaluable as these judgments are made. Once the key elements of the sample of each study are lined up in a table, how these samples look as a whole can be characterized in nuanced language of patterns and exceptions. The last steps are to make a judgment about the adequacy of sampling and samples in this area to meet an important research goal, and to propose steps to strengthen sampling in future research toward that goal.

But students cannot make judgments about sample adequacy until they know how to appraise all the other components of a study. A sample that is inadequate in one context may be exactly what is needed in another. For example, maxims about large and small samples cannot be applied across the board. Students cannot judge the adequacy of a sample to provide trustworthy evidence until they also are able to appraise that study's aims, data collection, analysis, reliability, and validity.
So, each component of research design must be studied in theory and then repeatedly applied in supervised practice, first to individual studies and then to groups of studies. The same cluster of articles of interest can be used over and over again for these exercises. By the end of such a course, the student will be well on the way to a synthesis of the adequacy of research designs in that article cluster.

Along with being able to synthesize the collective methodological strengths and weaknesses of a body of work depicted in a group of research reports, one needs to be able to synthesize what is known from those reports about the research problem of interest and the solutions tested to date. To achieve this second synthesis, one needs to move from facts on the clinical problem to a conceptual level of knowledge of the problem in context. Conceptual thinking about a clinical problem requires another novice-to-expert progression, but progress may be expedited when a student is immersed in the relevant clinical world. The novice researcher who is a competent or proficient clinician is likely to be able to visualize the clinical context of a study, the seriousness of illness and life complexities of the sample, and the clinical and personal exigencies that affect those individuals' progress and outcomes. This clinician may be frustrated at the lack of clinical detail and may be critical of the researchers' apparent disregard of important clinical covariates. These sensitivities are important but must be balanced by a new kind of conceptual understanding of how clinical problems and solutions are operationalized in research designs in a given discipline.
For example, an oncology nurse interested in depression and its treatment in patients with advanced cancer may be well-acquainted with the discouraging trajectory of failed cancer treatments and symptom progression, have seen what depression looks like in these patients, and know the commonly used antidepressant medications, but may well be unfamiliar with the conceptual differences between mood, depressive symptoms, and depression; the evidence on interaction of depression with physical aspects of quality of life; the characteristics and relevance of the instruments used to measure depression in research; or the conceptual underpinnings and operationalization of depression treatment modalities. Moving from a clinical to a conceptual grasp of research on depression would be an essential first step.

Moving to conceptual thinking enables the clinician to cluster and appraise research information in a new way. He or she will then be able to synthesize in separate sections of a paper what is known about the characteristics and correlates of mood disturbances in subsets of the advanced cancer population as measured with various tools, the strengths and shortcomings of those tools to capture relevant domains of experience in this population, the conceptual and practical reasoning behind selection of various treatment modalities for intervention testing, the fidelity with which those treatments have adhered to the conceptual underpinnings of the treatment approach, and so forth. Then he or she is positioned to comment on gaps in the literature and areas in need of further study.

Offering a stand-alone course on research methods and another course on the theory of a discipline and perhaps a third course on literature review may be a slow route from facts to knowledge.
An integrated approach, in which concepts and designs can be studied in depth as driving specific studies in a topic of interest, combined with guided and informed immersion in student-relevant research literature, may enable faster progression to research synthesis. For example, what if the three instructors who ordinarily teach the separate introductory courses named above instead offered three courses on appraising and synthesizing research, each covering a different focus within their expertise, such as population health, symptom management, or health systems? This kind of introductory course would differ from a course surveying the research evidence in a topic area. The instructor would teach how concepts and theories work in general, as well as in guiding that body of research, and would convey both the available range of study designs and methods and how they have or have not yet been applied in that field of interest. Perhaps you have taken such a course, or teach one. If so, let me know how it worked!

The student would be immersed in the research designs and conceptual underpinnings of the instructor's area of mastery, acquiring cognitive skills that can be leveraged in other content areas while building and synthesizing a literature base in an area of personal interest. The instructor might model synthesis by thinking aloud, voicing a cognitive progression from the facts about a given study to how they relate to the other studies on hand and the larger phenomena being pursued in that field of interest. The wisdom of the expert enables him or her to playfully draw comparisons and thoughtfully invent “what-if” scenarios in response to students' questions. Perhaps some research designs would receive more emphasis than others in a given course, based on the phenomenon of interest to the instructor, but this narrowed scope may be warranted as a faster route to research readiness in the student's chosen area.
Maybe every student must start the literature review process with “Smith and Jones found X and Brown and Guerrero found Y,” but every student can progress from that point to pattern recognition and appraisal. Whether in an integrated course such as the one described above or in another context, students should be exposed to literature reviews in which study descriptions are not ends in themselves but serve as examples to support well-substantiated, nuanced claims about the state of the science. Keeping the novice-to-expert and facts-to-wisdom trajectory in mind will help us facilitate movement toward synthesis in novice research writers. These models remind us that immersion in many examples, accompanied early on by engaged help to recognize core principles in those examples, is the route to competence and knowledge.
