Abstract

Trees in proximity to power lines can cause significant damage to utility infrastructure during storms, leading to substantial economic and societal costs. This study investigated the effectiveness of non-parametric machine learning algorithms in modeling tree-related outage risk to distribution power lines at a fine spatial scale. We used a vegetation risk model (VRM) comprising 15 predictor variables derived from roadside tree data, landscape information, vegetation management records, and utility infrastructure data. We evaluated the VRM’s performance using decision tree (DT), random forest (RF), k-nearest neighbors (k-NN), extreme gradient boosting (XGBoost), and support vector machine (SVM) techniques. The RF algorithm achieved the highest overall performance, with an accuracy of 0.753, an AUC-ROC of 0.746, a precision of 0.671, and an F1-score of 0.693; the SVM achieved the highest recall (0.727). Based on overall performance, RF emerged as the best machine learning algorithm, whereas DT was the least suitable. The DT required the shortest run times for both hyperparameter optimization (3.93 s) and model evaluation (0.41 s), whereas XGBoost and the SVM required the longest run times for hyperparameter optimization (9438.54 s) and model evaluation (112 s), respectively. These findings can inform efforts to enhance the resilience and reliability of the electric grid.
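
The paper's code and the VRM dataset are not provided here, but the evaluation protocol the abstract describes (tune each of the five classifiers, then score accuracy, AUC-ROC, precision, recall, and F1 on held-out data) can be illustrated with a minimal sketch. The sketch below assumes a scikit-learn / xgboost workflow; the synthetic data stands in for the 15-variable VRM dataset, and the hyperparameter grids are illustrative, not the paper's actual search spaces.

```python
# Minimal sketch of the comparison described in the abstract (assumptions:
# scikit-learn/xgboost stack, synthetic stand-in data, illustrative grids).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import (accuracy_score, roc_auc_score,
                             precision_score, recall_score, f1_score)
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Stand-in for the VRM predictors: 15 features, binary outage/no-outage label.
X, y = make_classification(n_samples=2000, n_features=15, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# One (estimator, hyperparameter grid) pair per algorithm in the study.
models = {
    "DT": (DecisionTreeClassifier(random_state=42),
           {"max_depth": [3, 5, 10, None]}),
    "RF": (RandomForestClassifier(random_state=42),
           {"n_estimators": [100, 300], "max_depth": [5, 10, None]}),
    "k-NN": (KNeighborsClassifier(),
             {"n_neighbors": [3, 5, 11]}),
    "XGBoost": (XGBClassifier(eval_metric="logloss"),
                {"n_estimators": [100, 300], "max_depth": [3, 6]}),
    "SVM": (SVC(probability=True, random_state=42),
            {"C": [0.1, 1, 10], "kernel": ["rbf"]}),
}

for name, (estimator, grid) in models.items():
    # Hyperparameter optimization on the training split via cross-validation,
    # then evaluation on the held-out test split.
    search = GridSearchCV(estimator, grid, cv=5, scoring="roc_auc", n_jobs=-1)
    search.fit(X_train, y_train)
    best = search.best_estimator_
    pred = best.predict(X_test)
    proba = best.predict_proba(X_test)[:, 1]
    print(f"{name}: acc={accuracy_score(y_test, pred):.3f} "
          f"auc={roc_auc_score(y_test, proba):.3f} "
          f"prec={precision_score(y_test, pred):.3f} "
          f"rec={recall_score(y_test, pred):.3f} "
          f"f1={f1_score(y_test, pred):.3f}")
```

On real data, run-time comparisons like those in the abstract would be obtained by timing the `search.fit` and scoring steps separately for each model.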
