Abstract

Background: Quality assurance (QA) in clinical trials is a safeguard against non-compliance, which can compromise patient safety and data integrity. At the institutional level, compiling quality findings provides insight into gaps in existing processes, protocol compliance, and educational content. A QA metrics methodology has been implemented that is aligned with ICH-GCP and with institutional Standard Operating Procedures (SOPs) and policies.

Methods: For each Quality Assurance Review (QAR), individual findings are assigned an alphanumeric code based on severity and category, along with the reference cited. Data are tracked annually for all QAR findings, and separate data sets are created to allow analysis of quality gaps and of changes in quality over time. Results are used to create or revise educational content and SOPs and to drive process improvement.

Results: To date, a total of 1,608 QAR observations between 2014 and mid-2015 have been coded and tracked across 22 studies. Areas of quality gaps, such as the most-cited categories (e.g., 27% in Source Documentation and 13% in Regulatory) and references (e.g., 30% referencing SOPs and 14% referencing guidelines), are communicated to the Quality and Education team regularly and incorporated into training content. These findings have also prompted the development of new SOPs, processes, and research tools, after which QA metrics continue to be used to monitor program-wide quality improvement. For example, following the implementation of Electronic Source Documentation, the proportion of findings related to delays in Adverse Event sign-off has declined. For individual QARs, a personalized trends summary is provided to the study team with an overview of the quality of study conduct; the report displays the distribution of findings across categories and severity levels. Lastly, quality metrics have increased the efficiency of tracking and reporting program-wide QA activity.

Conclusions: The regular analysis of quality metrics has proven to be a pivotal step in the Quality Management Cycle. It presents a quick snapshot of the quality of individual research studies under review. When implemented on an institutional scale, it offers valuable feedback on current SOPs, processes, and training content.
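To make the coding and aggregation workflow described in the Methods more concrete, the sketch below shows one way such findings data could be structured and summarized. The field names, example codes, and severity scale are illustrative assumptions and are not taken from the abstract; this is a minimal sketch, not the institution's actual tooling.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    qar_id: str    # identifier of the Quality Assurance Review (hypothetical format)
    code: str      # alphanumeric code, e.g. "SD-2" for Source Documentation, severity 2 (invented)
    category: str  # e.g. "Source Documentation", "Regulatory"
    severity: int  # e.g. 1 (minor) to 3 (major) -- assumed scale
    reference: str # e.g. "SOP", "Guideline", "Protocol"
    year: int

def summarize(findings: list[Finding]) -> dict:
    """Aggregate coded findings into the kinds of data sets the abstract describes:
    counts by category, reference, severity, and year."""
    return {
        "by_category": Counter(f.category for f in findings),
        "by_reference": Counter(f.reference for f in findings),
        "by_severity": Counter(f.severity for f in findings),
        "by_year": Counter(f.year for f in findings),
    }

# Mock data for illustration only
findings = [
    Finding("QAR-001", "SD-2", "Source Documentation", 2, "SOP", 2014),
    Finding("QAR-001", "RG-1", "Regulatory", 1, "Guideline", 2015),
]
print(summarize(findings)["by_category"])
# Counter({'Source Documentation': 1, 'Regulatory': 1})
```

A per-study trends summary, as mentioned in the Results, could be produced by running such an aggregation on the subset of findings for a single QAR.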
