Abstract

DNA sequencing continues to evolve quickly even after >30 years. Many new platforms have appeared suddenly, and formerly established systems have vanished almost as quickly. Since the establishment of next-generation sequencing (NGS) devices, this progress has gained momentum due to the continually growing demand for higher throughput, lower costs and better data quality. As a consequence of this rapid development, standardized procedures and data formats as well as comprehensive quality management considerations are still scarce. Here, we list and summarize current standardization efforts and quality management initiatives from companies, organizations and societies in the form of published studies and ongoing projects. On the one hand, these comprise quality documentation issues such as technical notes, accreditation checklists and guidelines for the validation of sequencing workflows. On the other hand, general standard proposals and quality metrics are being developed and applied to the individual steps of the sequencing workflow, with the main focus on upstream processes. Finally, selected standard developments for downstream pipeline data handling, processing and storage are discussed briefly. These standardization approaches represent a first basis for continuing work towards the prospective implementation of next-generation sequencing in important areas such as clinical diagnostics, where reliable results and fast processing are crucial. Additionally, these efforts will exert a decisive influence on the traceability and reproducibility of sequence data.

Highlights

  • The initial sequencing methods were developed by Maxam and Gilbert as well as Sanger and Coulson, with the latter being almost the only method in use for >30 years (Hutchison, 2007; Schuster, 2008)

  • Since next-generation sequencing (NGS) originated in the US and shows its broadest distribution there, the overwhelming majority of standardization efforts are based overseas

  • It became obvious that the current focus lies on NGS standardization in clinical diagnostics due to the highest demands and requirements regarding quality control (QC) and data reliability in this area

Introduction

The initial sequencing methods were developed by Maxam and Gilbert as well as Sanger and Coulson, with the latter being almost the only method in use for >30 years (Hutchison, 2007; Schuster, 2008). The QA program should contain QC methods for contamination identification at several stages within the sequencing workflow. These stages comprise the initial sample evaluation, the fragmentation step, the final library assessment, the monitoring of error rates during the sequencing process and the raw data analysis with a focus on read quality (Rehm et al., 2013). The CAP NGS Work Group also works on means of quality documentation, but in the broader context of overarching general QA. They developed 18 laboratory accreditation checklist requirements for upstream analytic processes and downstream bioinformatics solutions for NGS in clinical applications (Aziz et al., 2015). The MOL topics for the wet bench process are summarized in the Appendix (see Appendix A, Table A.2).
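
To illustrate the raw-data read-quality check mentioned above, the following minimal Python sketch computes the mean Phred quality per read in a FASTQ file and counts reads falling below a threshold. The Phred+33 encoding, the Q30 cut-off and the file name run_sample.fastq.gz are illustrative assumptions rather than requirements from the cited guidelines; in practice, dedicated tools such as FastQC are typically used for this step.

```python
import gzip
from statistics import mean

PHRED_OFFSET = 33          # Phred+33 (Sanger/Illumina 1.8+) encoding assumed
MIN_MEAN_QUALITY = 30.0    # illustrative cut-off; real thresholds depend on the application


def read_fastq(path):
    """Yield (read_id, sequence, quality_string) records from a plain or gzipped FASTQ file."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as handle:
        while True:
            header = handle.readline().rstrip()
            if not header:
                break
            seq = handle.readline().rstrip()
            handle.readline()                  # '+' separator line
            qual = handle.readline().rstrip()
            yield header[1:], seq, qual


def mean_phred(quality_string):
    """Convert ASCII quality characters to Phred scores and return their mean."""
    return mean(ord(ch) - PHRED_OFFSET for ch in quality_string)


def qc_report(path):
    """Count reads whose mean quality falls below the chosen threshold."""
    total = failed = 0
    for read_id, seq, qual in read_fastq(path):
        total += 1
        if mean_phred(qual) < MIN_MEAN_QUALITY:
            failed += 1
    print(f"{path}: {failed}/{total} reads below Q{MIN_MEAN_QUALITY:.0f} mean quality")


if __name__ == "__main__":
    qc_report("run_sample.fastq.gz")   # hypothetical input file
```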

Guidelines for validation of sequencing workflows in clinical applications
Standardization efforts from organizations and companies
Standard proposals for general sequencing workflows
Standard proposals for sample preparation step
Spike-in controls for downstream quality evaluation
The impact and classification of sequencing errors
Downstream bioinformatics pipeline and data analysis
Data submission requirements and standards
Findings
Conclusions