Abstract

We read with interest the article by Lin et al.,1 in which the authors clearly described the challenges of interpreting evidence from reviews of multicomponent interventions designed to prevent functional decline in elderly persons. Because we also struggled recently with collecting high-quality evidence from complex multicomponent interventions to improve care for older adults,2 we support the proposal by Lin et al. to establish a standardized core set of measures and criteria for performing and reporting research on complex interventions, but how this might be accomplished remains an important question.

Lin et al.1 presented a framework of important questions that must be answered to understand the heterogeneity in clinical trials and, related to these questions, provided considerations for future research regarding the health and functional decline of older adults. In this respect, introducing "core clinical outcomes" in a given research area would likely facilitate comparisons and thereby enhance the ability to make informed decisions and policies. For example, in the Netherlands, this strategy is currently implemented in all research projects within the National Care for the Elderly Program, which requires the collection of data for the Minimum Data Set (MDS) so that project results can be compared nationally.3 In addition, researchers are attempting to develop a meaningful composite outcome measure from this MDS to evaluate complex interventions for treating elderly people with complex needs.4

One question that Lin and colleagues did not explicitly address, and that in our opinion is essential for comparing trials of similar complex interventions, is the incorporation of process evaluations. Process evaluations facilitate the interpretation of outcome results by documenting and evaluating the underlying mechanisms and processes in detail.5 Process evaluations are being published at an increasing rate: searching PubMed for the term "process evaluation" retrieves 33 articles published in 2000, 58 in 2005, and 145 in 2012.

Process evaluations can serve several purposes that reflect the challenges that Lin and colleagues describe. For example, Reelick and colleagues developed a useful framework for gaining insight into recruitment success rates and the characteristics of study populations, the execution of complex interventions, and the acquisition of data.6 This framework can be used in process evaluations to overcome challenges in understanding targeted populations and complex interventions. Leontjevas and colleagues demonstrated the value of process evaluations in determining which statistical effect analyses should be performed to best account for internal and external validity, because this validity can influence the presentation and interpretation of effects.7 Thus, process evaluations can also help overcome challenges in performing outcome analyses. Moreover, although qualitative process evaluations are rarely included in randomized trials of complex healthcare interventions, they can be valuable, for example, to examine whether the intervention was delivered as intended, to investigate processes of implementation and change, to explore the experiences of important actors with the intervention, and to provide data that help interpret the findings and explain variations in effectiveness.8 Thus, if process evaluations are to be included, how should they be structured to limit heterogeneity in their results?
To perform process evaluations in a given subject area, a set of "core process measures" is needed in addition to a set of "core clinical outcomes." For example, core process measures could be based on components of Normalization Process Theory, which is particularly useful for understanding (the degree of) implementation, embedding, and integration of innovative complex interventions in healthcare organizations.9 In addition, process evaluations often describe how implementation fidelity, the degree to which an intervention is implemented as intended, influences its effect on outcomes.10 Therefore, the degree of implementation should be considered an outcome measure in effect studies, composed of quantitative and qualitative data.

Providing details regarding the complex processes of conducting trials of multicomponent interventions in (frail) elderly adults enables researchers and policy-makers to determine the value of these complex interventions. Process evaluations should be an explicit, prespecified component when designing trials of complex interventions: they should incorporate "core process measures" that describe the degree to which the intervention was implemented in the trial, should be published alongside the results of the corresponding effect studies, and should be incorporated into systematic reviews. Research and guidelines regarding which core process measures should be used in geriatric research are needed.

Conflict of Interest: The editor in chief has reviewed the conflict of interest checklist provided by the authors and has determined that the authors have no financial or any other kind of personal conflicts with this paper.

Author Contributions: Bakker F.C.: drafting and editing the letter. Persoon, Reelick, Van Munster, Hulscher, Olde Rikkert: editing the letter.

Sponsor's Role: None.
