Abstract

Tuning a model's hyperparameters is an essential step in developing a high-performance machine learning model. Grid search and random search are commonly used to tune the hyperparameters of machine learning algorithms. Ensemble learners are a category of machine learning algorithm; ensemble classifiers fall into two types: bagging, a parallel ensemble method, and boosting, a sequential ensemble method. The proposed work uses two boosting classifiers, the AdaBoost algorithm and the gradient boosting algorithm, and one bagging classifier, the random forest algorithm. A model for early heart disease prediction was developed using the AdaBoost, random forest, and gradient boosting classifiers, with the Cleveland heart disease dataset used to train and validate them. Comparing the performance of these ensemble learners, the gradient boosting algorithm outperforms the AdaBoost and random forest classifiers. This paper evaluates the efficiency of the grid search and random search algorithms by tuning the hyperparameters of the gradient boosting, AdaBoost, and random forest algorithms. The performance analysis shows that tuning an ensemble classifier's hyperparameters with either grid search or random search can increase the learner's efficiency.
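A minimal sketch of the tuning setup the abstract describes, assuming scikit-learn. The parameter grids and distributions below are illustrative choices, not the paper's exact search ranges, and synthetic data generated with `make_classification` stands in for the Cleveland heart disease dataset (303 samples, 13 features, binary target).

```python
# Sketch: grid search vs. random search over a gradient boosting classifier.
# Assumes scikit-learn and SciPy; the hyperparameter ranges are hypothetical.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split

# Placeholder for the Cleveland heart disease dataset.
X, y = make_classification(n_samples=303, n_features=13, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    GradientBoostingClassifier(random_state=42),
    param_grid={
        "n_estimators": [50, 100, 200],
        "learning_rate": [0.01, 0.1, 0.5],
        "max_depth": [2, 3, 4],
    },
    cv=5,
)
grid.fit(X_train, y_train)
print("Grid search:", grid.best_params_, grid.score(X_test, y_test))

# Random search: samples a fixed number of candidates from distributions,
# which is cheaper than the exhaustive grid when the search space is large.
rand = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=42),
    param_distributions={
        "n_estimators": randint(50, 300),
        "learning_rate": uniform(0.01, 0.5),
        "max_depth": randint(2, 6),
    },
    n_iter=20,
    cv=5,
    random_state=42,
)
rand.fit(X_train, y_train)
print("Random search:", rand.best_params_, rand.score(X_test, y_test))
```

The same pattern applies to the other two classifiers studied: swap in `AdaBoostClassifier` or `RandomForestClassifier` with an appropriate parameter grid to reproduce the three-way comparison.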
