Objective: Blood glucose levels in diabetes are influenced by many factors, which makes prediction challenging when numerous variables are considered. This study therefore proposes an optimized LSTM model that improves blood glucose prediction accuracy using a single variable (CGM data), thereby simplifying diabetes management.

Methods: An LSTM model was implemented to predict blood glucose levels in type-1 diabetics over a 60-minute prediction horizon. The grid search technique was used to fine-tune the hyperparameters of the LSTM model. The Ohio dataset was used for this study. Root mean squared error (RMSE) and error grid analysis (EGA) were used to evaluate model performance.

Findings: The proposed model achieves an RMSE of 26.13 ± 3.25 mg/dl for the 60-minute prediction horizon. Clarke error grid analysis (CEGA) was used to validate the clinical acceptability of the proposed model, yielding more than 97% clinically acceptable predictions on the Ohio dataset.

Novelty: The patient-specific optimized model outperforms the conventional LSTM model, and the results show a significant improvement over previous work in terms of root mean square error (RMSE).

Keywords: Long short-term memory (LSTM), Grid search optimization technique, Continuous glucose monitoring (CGM), Blood glucose, Type-1 diabetes
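The abstract describes tuning LSTM hyperparameters by grid search, scored with RMSE. A minimal sketch of that workflow is below; the hyperparameter ranges and the `evaluate` function are hypothetical stand-ins (the paper's actual search space and training loop are not given here), with a deterministic toy score used in place of training so the sketch runs without a deep-learning framework:

```python
import itertools
import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error in mg/dl, the evaluation metric reported in the paper
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Hypothetical search space -- the paper does not list its exact grid
grid = {
    "units": [32, 64, 128],          # LSTM hidden units
    "lookback": [6, 12],             # CGM samples per input window
    "learning_rate": [1e-3, 1e-4],
}

def evaluate(params):
    # Stand-in for: build an LSTM with `params`, train on CGM data,
    # return validation RMSE. A deterministic toy score keeps this runnable.
    return (abs(params["units"] - 64) * 0.1
            + abs(params["lookback"] - 12)
            + params["learning_rate"] * 100)

def grid_search(grid):
    # Exhaustively score every combination and keep the lowest-RMSE one
    best_params, best_score = None, float("inf")
    for values in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        score = evaluate(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

In a real run, `evaluate` would train a patient-specific LSTM and return its validation RMSE; the exhaustive loop then selects the per-patient configuration, which is what makes the optimized model patient-specific.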