Abstract

Background
Accurate determination of low-density lipoprotein cholesterol (LDL) is important for assessing the risk of coronary heart disease and atherosclerosis. Apart from direct determination of LDL values, models (or equations) are used. A more recent approach is the use of machine learning (ML) algorithms.

Methods
ML algorithms were used for LDL determination (regression) from total cholesterol, HDL, and triglycerides. The methods used were multivariate Linear Regression (LR), Support Vector Machines (SVM), Extreme Gradient Boosting (XGB), and Deep Neural Networks (DNN), in both larger and smaller data sets. In addition, LDL values were classified according to both NCEP III and European Society of Cardiology guidelines.

Results
The performance of regression was assessed by the Standard Error of the Estimate. ML methods performed better than established equations (Friedewald and Martin). The performance of all ML methods was comparable for large data sets and was affected by the divergence between the train and test data sets, as measured by the Jensen-Shannon divergence. Classification accuracy was not satisfactory for any model.

Conclusions
Direct determination of LDL is the preferred route. When it is not available, ML methods can be a good substitute. Less computationally expensive ML methods can work as well as deep neural networks.
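The comparison described above can be sketched in miniature. The example below fits a multivariate linear regression for LDL from total cholesterol, HDL, and triglycerides, and compares its Standard Error of the Estimate against the classic Friedewald estimate (LDL = TC − HDL − TG/5, in mg/dL). The data here are synthetic and purely illustrative; the paper's actual data sets, features, and hyperparameters are not reproduced.

```python
import numpy as np

# Synthetic lipid-panel data (assumption for illustration only)
rng = np.random.default_rng(0)
n = 1000
tc = rng.normal(200, 35, n)                   # total cholesterol, mg/dL
hdl = rng.normal(50, 12, n)                   # HDL cholesterol, mg/dL
tg = np.clip(rng.normal(140, 60, n), 40, None)  # triglycerides, mg/dL
# Synthetic "true" LDL: Friedewald relation plus measurement noise
ldl = tc - hdl - tg / 5 + rng.normal(0, 8, n)

# Multivariate linear regression via least squares (intercept column included)
X = np.column_stack([np.ones(n), tc, hdl, tg])
coef, *_ = np.linalg.lstsq(X, ldl, rcond=None)
pred_lr = X @ coef

# Classic Friedewald estimate, mg/dL
pred_fw = tc - hdl - tg / 5

def see(y, yhat):
    # Standard Error of the Estimate (root-mean-square residual)
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

print(f"SEE, Friedewald:        {see(ldl, pred_fw):.1f} mg/dL")
print(f"SEE, linear regression: {see(ldl, pred_lr):.1f} mg/dL")
```

On its own training data, the fitted regression can never have a larger SEE than the fixed Friedewald formula, since the formula is itself a linear function of the same inputs; the paper's stronger claim is that this advantage also holds out-of-sample against both Friedewald and Martin.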
