Abstract

In this article, the grid search approach was used to tune the hyperparameters of the Light Gradient Boosting Machine, Random Forest, Gradient Boosting, Extra Trees, AdaBoost, and Linear Discriminant Analysis algorithms for Gamma and Hadron classification. ROC and Precision-Recall curves were also discussed to assess the performance of the algorithms on this task. The Light Gradient Boosting Machine reached an AUC value of 0.94 for Gamma and Hadron classification in 33 s, and the results of the Random Forest, the Light Gradient Boosting Machine, and the Linear Discriminant Analysis were all roughly comparable.
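
The sketch below illustrates the general grid-search-plus-ROC workflow described above, assuming a scikit-learn setup with LightGBM. The synthetic data stands in for the Gamma/Hadron features, and the parameter grid is hypothetical; the article does not specify the exact values searched.

```python
# Minimal sketch: grid-search tuning of LightGBM and ROC AUC evaluation.
# The data and parameter grid are illustrative assumptions, not the
# article's exact experimental setup.
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic binary data standing in for the Gamma/Hadron features.
X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Hypothetical hyperparameter grid for the grid search.
param_grid = {
    "n_estimators": [100, 300],
    "learning_rate": [0.05, 0.1],
    "num_leaves": [31, 63],
}

search = GridSearchCV(
    LGBMClassifier(random_state=0),
    param_grid,
    scoring="roc_auc",
    cv=5,
)
search.fit(X_train, y_train)

# Score the tuned model with ROC AUC, the metric reported in the abstract.
probs = search.best_estimator_.predict_proba(X_test)[:, 1]
print("Best parameters:", search.best_params_)
print("Test AUC:", roc_auc_score(y_test, probs))
```

The same loop can be repeated for the other classifiers mentioned in the abstract (Random Forest, Gradient Boosting, Extra Trees, AdaBoost, Linear Discriminant Analysis) by swapping the estimator and its parameter grid.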
