Abstract

It has been assumed that continuous glucose sensors show substantial time lags relative to blood glucose. This assumption has led to suggestions that sensors are less accurate during rapidly changing glucose levels and that sensors should be calibrated only when glucose levels are stable. The analysis presented here tests the assumption of substantial sensor time lag and its suggested effects using clinical data from the DexCom (San Diego, CA) SEVEN. Sensor and blood glucose data were collected from 117 adult subjects with insulin-dependent diabetes. Each subject wore the sensor for 7 days and underwent an 8- to 10-h in-clinic tracking study during which blood glucose was measured every 15-20 min. Accuracy (absolute relative difference [ARD]) versus blood glucose rate of change was evaluated on the in-clinic data set. The effect on accuracy of calibration during rapid rates of change was evaluated on the combined home-use and in-clinic data set. Average sensor time lag versus blood glucose was 5.7 min. Mean ARD across rate-of-change bins (from less than -2 to greater than 2 mg/dL/min) ranged from 15.0% to 16.3%. Across rates of change during calibration, mean ARD after calibration ranged from 13.2% to 16.0%. Calibration with reference measurements instead of patient measurements improved overall mean ARD from 16.0% to 8.5%. For this sensor, the assumption of substantial time lag and its suggested effects may be incorrect. The main source of error is the calibration process.
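
To make the accuracy metric concrete, the sketch below (Python; not from the paper) shows one way mean ARD could be computed within rate-of-change bins matching the ranges quoted above. The function name, variable names, and exact bin edges are illustrative assumptions, not the authors' analysis code.

    import numpy as np

    # Illustrative sketch, not the authors' code: mean absolute relative
    # difference (ARD) between matched sensor and reference blood glucose
    # readings, grouped by blood glucose rate of change. Bin edges and
    # labels are assumptions chosen to mirror the ranges quoted above.
    def mean_ard_by_rate(sensor, reference, rate):
        """sensor, reference: paired glucose values in mg/dL;
        rate: blood glucose rate of change in mg/dL/min for each pair."""
        sensor = np.asarray(sensor, dtype=float)
        reference = np.asarray(reference, dtype=float)
        rate = np.asarray(rate, dtype=float)

        # ARD for each matched pair, as a percentage of the reference value.
        ard = 100.0 * np.abs(sensor - reference) / reference

        # Bins spanning "less than -2" to "greater than 2" mg/dL/min.
        inner_edges = [-2.0, -1.0, 0.0, 1.0, 2.0]
        labels = ["<-2", "-2 to -1", "-1 to 0", "0 to 1", "1 to 2", ">2"]
        idx = np.digitize(rate, inner_edges)  # bin index for each pair

        return {labels[i]: float(ard[idx == i].mean())
                for i in range(len(labels)) if np.any(idx == i)}

For example, mean_ard_by_rate([110, 95], [100, 100], [0.5, -2.5]) would report a 10% mean ARD in the "0 to 1" bin and 5% in the "<-2" bin.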
