Abstract

The advent of machine learning has made a remarkable impact in the field of healthcare. Diabetes mellitus is a metabolic disorder that poses a severe threat and exerts substantial pressure on human health worldwide. It is a global public health problem: in 1980, 108 million adults worldwide had diabetes, and by 2040 the number is expected to reach 642 million. Extensive interdisciplinary research, drawing on statistics, machine learning, artificial intelligence, visualization, and related fields, is therefore carried out for better management of diabetes. This paper focuses on time series forecasting algorithms. Data-driven time series machine learning models are used to extract meaningful information from large volumes of blood glucose and related data for precise forecasting of upcoming blood glucose fluctuations. Such forecasts not only warn the patient and physician in advance so that complications can be averted, but also help predict the response to certain medications. Here, univariate data-driven time series models are implemented on two continuous glucose monitoring sensor datasets: a Libre Pro dataset of 10 patients and the Ohio T1DM dataset of 6 patients. The performance of the different time series machine learning algorithms is compared using root mean squared error (RMSE), mean absolute percentage error (MAPE) and Theil's U, which are statistical analyses, and Clarke's error grid, which is a clinical analysis, for prediction horizons from 15 to 45 min. Holt's linear (AAN) algorithm on the Libre Pro dataset with alpha and beta of 0.99 gave the lowest error among the exponential smoothing algorithms, with an RMSE of 7.98 mg/dl for the 15-min, 19.47 mg/dl for the 30-min and 28.40 mg/dl for the 45-min prediction horizon. Theil's U coefficient was 0.12 for the 15-min, 0.39 for the 30-min and 0.72 for the 45-min prediction horizon. The autoregressive integrated moving average (ARIMA) algorithm gave the best overall results, with an RMSE of 7.07 mg/dl and a MAPE of 3.98 for the 15-min horizon. The performance results were on par when these algorithms were tested on the Ohio T1DM dataset, where ARIMA again performed best, with an RMSE of 13.14 mg/dl and a MAPE of 8.213 for the 15-min horizon. The higher error coefficients on the Ohio dataset were due to missing data.
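To make the workflow described above concrete, the following is a minimal sketch (not the authors' code) of fitting Holt's linear trend smoothing with alpha = beta = 0.99 and an ARIMA model to a glucose series, then scoring the forecasts with RMSE, MAPE and Theil's U. It assumes a synthetic CGM trace sampled every 5 minutes (so a 15-min horizon is 3 steps ahead), an arbitrary placeholder ARIMA order of (2, 1, 2), and the U1 variant of Theil's statistic; none of these specifics are stated in the abstract.

```python
# Hypothetical sketch only; all dataset, order and horizon choices are assumptions.
import numpy as np
from statsmodels.tsa.holtwinters import Holt
from statsmodels.tsa.arima.model import ARIMA

def rmse(actual, pred):
    return float(np.sqrt(np.mean((np.asarray(actual) - np.asarray(pred)) ** 2)))

def mape(actual, pred):
    actual, pred = np.asarray(actual), np.asarray(pred)
    return float(np.mean(np.abs((actual - pred) / actual)) * 100)

def theils_u1(actual, pred):
    # Theil's U1 coefficient; the abstract does not state which variant was used.
    actual, pred = np.asarray(actual), np.asarray(pred)
    return rmse(actual, pred) / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(pred ** 2)))

# Synthetic glucose trace (mg/dl) standing in for a Libre Pro / Ohio T1DM series.
rng = np.random.default_rng(0)
glucose = 120 + np.cumsum(rng.normal(0, 2, size=500))

horizon = 3  # 3 steps x 5-min sampling = 15-min prediction horizon (assumed)
train, test = glucose[:-horizon], glucose[-horizon:]

# Holt's linear (additive trend) smoothing with fixed alpha = beta = 0.99.
holt_fit = Holt(train).fit(smoothing_level=0.99, smoothing_trend=0.99, optimized=False)
holt_pred = holt_fit.forecast(horizon)

# ARIMA; the (p, d, q) order is not given in the abstract, (2, 1, 2) is a placeholder.
arima_fit = ARIMA(train, order=(2, 1, 2)).fit()
arima_pred = arima_fit.forecast(horizon)

for name, pred in [("Holt", holt_pred), ("ARIMA", arima_pred)]:
    print(name, "RMSE:", rmse(test, pred), "MAPE:", mape(test, pred),
          "Theil U:", theils_u1(test, pred))
```

In practice one would evaluate over many rolling forecast origins per patient rather than a single train/test split, and add Clarke error grid analysis for the clinical assessment.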
