Abstract
Background: Evaluating applications for multi-national, multi-disciplinary, dual-purpose research consortia is highly complex. There has been little research on the peer review process for evaluating grant applications, and almost none on how applications for multi-national consortia are reviewed. Overseas development investments are increasingly being channelled into international science consortia to generate high-quality research while simultaneously strengthening multi-disciplinary research capacity. We need a better understanding of how such decisions are made and of their effectiveness.
Methods: An award-making institution planned to fund 10 UK-Africa research consortia. Over two annual rounds, 34 of 78 eligible applications were shortlisted and reviewed by at least five external reviewers before final selections were made by a face-to-face panel. We used an innovative approach involving structured, overt observations of award-making panel meetings and semi-structured interviews with panel members to explore how assessment criteria concerning research quality and capacity strengthening were applied during the peer review process. Data were coded and analysed using pre-designed matrices that incorporated categories relating to the assessment criteria.
Results: In general, the process was rigorous and well-managed. However, a lack of clarity about the differential weighting of criteria, and variations in the panel's understanding of research capacity strengthening, resulted in some inconsistencies in the use of the assessment criteria. Using the same panel for both rounds had advantages: during the second round, consensus was achieved more quickly and the panel focused more on development aspects.
Conclusion: Grant assessment panels for such complex research applications need topic- and context-specific expertise. They must also understand research capacity issues and take a flexible but equitable and transparent approach. This study has developed and tested an approach for evaluating the operation of such panels and has generated lessons that can promote coherence and transparency among grant-makers and ultimately make the award-making process more effective.
Highlights
The use of peer review by expert panels is a well-established method for assessing scientific research and for evaluating grant applications (Abdoul et al., 2012; Coryn et al., 2007; Lawrenz et al., 2012; Wooten et al., 2014)
Except for Lamont's (2009) in-depth work and a recent review (Guthrie et al., 2017), little research has been conducted into the peer review process for grant applications; the few studies that have been conducted focused on individual research projects (Abdoul et al., 2012) and national research collaborations (Klein & Olbrecht, 2011)
This study primarily used structured, overt observations and semi-structured interviews to explore the peer review process used by an award-making institution to select ten United Kingdom (UK)-Africa natural science research consortia over two annual rounds
Summary
The use of peer review by expert panels is a well-established method for assessing scientific research and for evaluating grant applications (Abdoul et al., 2012; Coryn et al., 2007; Lawrenz et al., 2012; Wooten et al., 2014). Such research consortia usually comprise research institutions in high-income countries (HIC) and low- and middle-income countries (LMIC). Such consortia generally aim to generate innovative science through world-class research and to strengthen research capacity at the individual, institutional and national/international levels. Overseas development investments are increasingly being channelled into international science consortia to generate high-quality research while simultaneously strengthening multi-disciplinary research capacity. We used an innovative approach involving structured, overt observations of award-making panel meetings and semi-structured interviews with panel members to explore how assessment criteria concerning research quality and capacity strengthening were applied during the peer review process. Conclusion: Grant assessment panels for such complex research applications need topic- and context-specific expertise, an understanding of research capacity issues, and a flexible but equitable and transparent approach.