Abstract

The World Health Organization (WHO) recommends the clinical use of a human immunodeficiency virus 1 (HIV-1) viral load (VL) threshold of 1000 copies (cp)/mL in patients on antiretroviral therapy (ART) to distinguish between viral control (VL < 1000 cp/mL) and viral failure or poor adherence (VL > 1000 cp/mL). The accuracy of five quantitative HIV-1 RNA assays at this level was compared by replicate testing (n = 24) of 1000 cp/mL samples prepared from the Viral Quality Control (VQC) HIV-1 subtype B standard, which has been in use for the validation of nucleic acid testing methods since 1995. Until 2004, the VL assays reported geometric mean (95% confidence interval [CI]) values ranging between 449 (188-1067) and 3162 (3057-2367) cp/mL when the Siemens bDNA 3.0 assay was used as the reference method for an assigned value of 1000 (962-1038) cp/mL. In 2018, 24 replicate tests of the 1000 cp/mL samples in each VL assay yielded the following values (95% CI): Abbott RealTime 1084 (784-1572), BioMerieux EasyQ 1110 (533-2230), Roche CAP/CTM 1277 (892-1828), Hologic Aptima 1616 (1324-1973), and Cepheid GeneXpert 2502 (1713-3655) cp/mL. Calibration studies involving three consecutive WHO replacement standards showed a significant drift in the number of RNA copies per International Unit over time. Heat inactivation of HIV-1 standards was found to have a destandardizing effect. Our study underlines the limitations of HIV-1 RNA assay calibration based on frequently replaced WHO international standards. It is therefore proposed that clinicians interpret the recommended 1000 cp/mL alert level in therapy monitoring with an inaccuracy range of 500 to 2000 cp/mL.
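As an illustrative aside (not part of the original study), the sketch below shows how geometric means and 95% CIs of the kind reported above are conventionally computed from replicate viral load measurements, i.e., on the log10 scale. The replicate values used here are hypothetical placeholders generated around 1000 cp/mL, not the study's raw data.

# Minimal sketch: geometric mean and 95% CI of replicate VL results,
# computed on the log10 scale as is conventional for HIV-1 RNA data.
# Hypothetical example values only; not data from the study.
import numpy as np
from scipy import stats

def geometric_mean_ci(vl_copies_per_ml, confidence=0.95):
    logs = np.log10(np.asarray(vl_copies_per_ml, dtype=float))
    n = logs.size
    mean_log = logs.mean()
    sem_log = logs.std(ddof=1) / np.sqrt(n)          # standard error on log scale
    t_crit = stats.t.ppf(0.5 + confidence / 2, df=n - 1)
    lower = mean_log - t_crit * sem_log
    upper = mean_log + t_crit * sem_log
    # Back-transform to cp/mL
    return 10 ** mean_log, (10 ** lower, 10 ** upper)

# 24 hypothetical replicates scattered around 1000 cp/mL (log10 mean = 3.0)
rng = np.random.default_rng(0)
replicates = 10 ** rng.normal(loc=3.0, scale=0.15, size=24)
gm, (lo, hi) = geometric_mean_ci(replicates)
print(f"geometric mean {gm:.0f} cp/mL, 95% CI {lo:.0f}-{hi:.0f} cp/mL")

Because the CI is constructed on the log scale and back-transformed, it is asymmetric around the geometric mean in cp/mL, which is why the reported intervals above (e.g., 1277, CI 892-1828) are wider above the mean than below it.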
