Abstract
Background
There is uncertainty as to whether an increased frequency of calibrations affects the overall analytical variability of a measurement procedure, as reflected in quality control (QC) performance. In this simulation study, we examined the impact of calibration frequency on the variability of laboratory measurements.

Methods
A 5-point calibration curve was modeled with simulated concentrations ranging from 10 to 10,000 mmol/l, and signal intensities with coefficients of variation (CVs) of 3 % around the mean under a Gaussian distribution. Three levels of QC (20, 150, 600 mmol/l) interspersed within the analytical measurement range were also simulated.

Results
The CV of the 3 QC levels remained stable across the different calibration frequencies simulated (5, 10, 15 and 30 QC measurements per recalibration episode). The imprecision was greatest (18 %) at the lowest concentration of 20 mmol/l when the calibration curve was derived using ordinary least squares (OLS) regression, reducing to 3.5 % and 3.8 % at 150 and 600 mmol/l, respectively. The CV of all 3 QC concentrations remained constant at 3.4 %, close to the predefined CV (3 %), when weighted least squares (WLS) regression was used to derive the calibration model. Similar findings were observed with 2-point calibrations using WLS models at narrower concentration ranges (50 and 100 mmol/l, as well as 50 and 500 mmol/l).

Discussion
Within the parameters of the simulation study, an increased frequency of calibration events does not adversely impact the overall analytical performance of a measurement procedure under most circumstances.
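The simulation described in the Methods can be sketched in a few lines of numpy. This is a hedged reconstruction, not the authors' code: the calibrator spacing (10, 100, 500, 2000, 10000 mmol/l), the linear true response (slope 1, intercept 0), the 1/x² WLS weighting, and the episode counts are all assumptions chosen to illustrate the OLS-vs-WLS contrast the abstract reports, namely inflated imprecision at the low QC level under OLS and near-nominal CVs under WLS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed 5-point calibration spanning the stated 10-10,000 mmol/l range
cal_conc = np.array([10.0, 100.0, 500.0, 2000.0, 10000.0])
qc_conc = np.array([20.0, 150.0, 600.0])   # the 3 simulated QC levels
CV = 0.03                                  # 3 % Gaussian signal CV
SLOPE, INTERCEPT = 1.0, 0.0                # assumed true linear response

def noisy_signal(conc):
    """Signal with Gaussian noise whose SD is 3 % of the mean signal."""
    mean = SLOPE * conc + INTERCEPT
    return mean + rng.normal(0.0, CV * mean)

def fit(signals, weighted):
    """Fit a straight calibration line; returns (intercept, slope).

    numpy's polyfit weights multiply the residuals, so w = 1/x applies
    1/x^2 weighting to the squared residuals (inverse-variance here,
    since the noise SD is proportional to concentration).
    """
    w = 1.0 / cal_conc if weighted else np.ones_like(cal_conc)
    b, a = np.polyfit(cal_conc, signals, 1, w=w)
    return a, b

def simulate(n_episodes=500, qc_per_cal=10, weighted=False):
    """CV of back-calculated QC results over repeated recalibration episodes."""
    est = {float(c): [] for c in qc_conc}
    for _ in range(n_episodes):
        a, b = fit(noisy_signal(cal_conc), weighted)   # recalibrate
        for _ in range(qc_per_cal):                    # QC runs between calibrations
            meas = (noisy_signal(qc_conc) - a) / b     # back-calculate concentration
            for c, m in zip(qc_conc, meas):
                est[float(c)].append(m)
    return {c: float(np.std(v) / np.mean(v)) for c, v in est.items()}
```

Running `simulate(weighted=False)` versus `simulate(weighted=True)` reproduces the qualitative pattern of the Results: the unweighted (OLS) fit is dominated by the highest calibrator, so its intercept error is large relative to a 20 mmol/l QC and the low-level CV is inflated, while the WLS fit keeps all three QC CVs near the predefined 3 %. Varying `qc_per_cal` corresponds to varying calibration frequency.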