Abstract

Next-generation sequencing (NGS) is now routinely used in the diagnosis of hereditary diseases such as human cardiomyopathies. It is therefore of utmost importance to secure high-quality sequencing data, enabling the identification of disease-relevant mutations or the confident reporting of negative test results. During sample preparation, each target enrichment library preparation protocol has its own requirements for quality control (QC); however, there is little evidence on the actual impact of these guidelines on the resulting data quality. In this study, we analyzed the impact of QC during the individual library preparation steps of Agilent SureSelect XT target enrichment and Illumina sequencing. For a cohort of around 600 samples, we quantified the following parameters: the starting amount of DNA; the amount of sheared DNA and the smallest and largest fragment sizes of the starting DNA; the amount of DNA after the pre-capture PCR and the smallest and largest fragment sizes of the resulting DNA; the amount of the final library and its smallest and largest fragment sizes; and the number of detected variants. Intriguingly, all QC steps showed a high tolerance for variation: within the boundaries proposed in the current study, considerable variance at each step of QC can be well tolerated without compromising NGS quality.
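The per-sample QC parameters enumerated above lend themselves to a simple automated check against protocol boundaries. The following is a minimal sketch in Python; the field names and numeric thresholds are illustrative placeholders, not the boundaries proposed in the study:

```python
from dataclasses import dataclass

@dataclass
class SampleQC:
    """QC measurements taken at the library-preparation checkpoints."""
    input_dna_ng: float      # starting amount of DNA
    sheared_dna_ng: float    # amount of sheared DNA
    pre_pcr_dna_ng: float    # amount of DNA after the pre-capture PCR
    final_library_ng: float  # amount of the final library
    fragment_min_bp: int     # smallest fragment size of the final library
    fragment_max_bp: int     # largest fragment size of the final library

def passes_qc(s: SampleQC) -> bool:
    """Check each measurement against hypothetical protocol boundaries.

    The numeric thresholds below are placeholders for illustration only.
    """
    checks = [
        s.input_dna_ng >= 200.0,
        s.sheared_dna_ng >= 100.0,
        s.pre_pcr_dna_ng >= 50.0,
        s.final_library_ng >= 10.0,
        s.fragment_min_bp >= 150,
        s.fragment_max_bp <= 500,
    ]
    return all(checks)

sample = SampleQC(3000.0, 1500.0, 400.0, 50.0, 200, 400)
print(passes_qc(sample))
```

In a production pipeline such a check would typically run once per checkpoint, so that a failing sample can be flagged before the next, more expensive preparation step.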

Highlights

  • Before the advent of next-generation sequencing (NGS), genetic testing was realized by Sanger sequencing [1], which meant analyzing a gene exon-wise or amplicon-wise in a relatively elaborate, time-consuming and costly way

  • While it is broadly appreciated that post-processing of sequencing data is inevitable, less certainty exists on the influence of wet-lab steps during library preparation on the final quality of variant calls

  • We first examined the statistical distributions of all assessed quality control (QC) parameters over a set of 581 patient samples undergoing SureSelect target enrichment
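Examining the statistical distribution of a QC parameter across a cohort amounts to computing standard summary statistics per parameter. A minimal sketch using only the Python standard library, with hypothetical measurements (the study's cohort comprised 581 samples):

```python
import statistics

# Hypothetical QC measurements (ng of sheared DNA) for a handful of samples.
sheared_dna_ng = [1480.0, 1520.0, 1390.0, 1610.0, 1455.0, 1535.0]

def summarize(values):
    """Return summary statistics characterizing one QC parameter."""
    quartiles = statistics.quantiles(values, n=4)
    return {
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values),
        "iqr": quartiles[2] - quartiles[0],  # interquartile range
    }

print(summarize(sheared_dna_ng))
```

The same summary would be computed for each of the parameters listed in the abstract (input amounts, post-shearing and post-PCR amounts, fragment sizes, and variant counts) to establish tolerable boundaries per QC step.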


Introduction

Before the advent of next-generation sequencing (NGS), genetic testing was realized by Sanger sequencing [1], which meant analyzing a gene exon-wise or amplicon-wise in a relatively elaborate, time-consuming, and costly way. One major step on this path was the first marketing authorization of an NGS instrument (Illumina's MiSeqDx) by the United States Food and Drug Administration (US FDA) [4]. Despite such optimism, less certainty exists on the standards required to ensure sequencing quality. For gene panel or target enrichment sequencing, a number of distinct protocols based on, e.g., PCR, hybridization, or selective circularization have been developed [5]. For each of these methods, stringent quality control (QC) steps were introduced to ensure consistent data quality of the resulting NGS process. However, it remains virtually unknown how abnormal QC results actually affect the sequencing outcome.

