Abstract

The infrared (IR) spectra of whole blood EDTA samples, in the range between 1500 and 750 cm⁻¹, obtained from the patient population of a general hospital, were used to compare different multivariate calibration techniques for quantitative glucose determination. Ninety-six spectra of whole undiluted blood samples with glucose concentrations ranging between 44 and 291 mg/dL were used to create calibration models based on a combination of partial least-squares (PLS) and artificial neural network (ANN) methods. The prediction capabilities of these calibration models were evaluated by comparing their standard errors of prediction (SEP) with those obtained with the use of PLS and principal component regression (PCR) calibration models in an independent prediction set consisting of 31 blood samples. The optimal model based on the combined PLS-ANN produced smaller SEP values (15.6 mg/dL) compared with those produced with the use of either PLS (21.5 mg/dL) or PCR (24.0 mg/dL) methods. Our results revealed that the combined PLS-ANN models can better approximate the deviations from linearity in the relationship between spectral data and concentration, compared with either PLS or PCR models.
