Abstract

The distribution network line loss rate is a crucial factor in improving the economic efficiency of power grids, yet traditional prediction models suffer from low accuracy. This study proposes a predictive method based on data preprocessing and model integration to improve accuracy. Data preprocessing employs dynamic cleaning technology with machine learning to enhance data quality. Model integration combines long short-term memory (LSTM), linear regression, and extreme gradient boosting (XGBoost) models to achieve multi-angle modeling. Regression evaluation metrics are used to assess the difference between predicted and actual results. Experimental results show that this method improves on competing models: compared to LSTM alone, for example, root mean square error (RMSE) was reduced by 44.0% and mean absolute error (MAE) by 23.8%. The method provides a technical basis for building accurate line loss monitoring systems and enhancing power grid operations.
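The abstract describes combining LSTM, linear regression, and XGBoost predictions and evaluating the result with RMSE and MAE. The sketch below illustrates one common way such an integration can work (a weighted average of per-model predictions) together with the two evaluation metrics; the weights and the toy loss-rate values are purely hypothetical, not taken from the paper.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between actual and predicted values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error between actual and predicted values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs(y_true - y_pred)))

def ensemble_predict(model_preds, weights):
    """Weighted average of per-model prediction vectors (one row per model)."""
    return np.average(np.vstack(model_preds), axis=0, weights=weights)

# Toy line-loss-rate data (%), purely illustrative
actual = [4.2, 3.8, 5.1, 4.6]
lstm_p = [4.0, 3.5, 5.4, 4.9]   # hypothetical LSTM predictions
lin_p  = [4.5, 4.0, 4.8, 4.4]   # hypothetical linear regression predictions
xgb_p  = [4.1, 3.9, 5.0, 4.7]   # hypothetical XGBoost predictions

# Hypothetical ensemble weights; in practice these would be fit on validation data
combined = ensemble_predict([lstm_p, lin_p, xgb_p], weights=[0.4, 0.2, 0.4])

print("RMSE:", round(rmse(actual, combined), 4))
print("MAE: ", round(mae(actual, combined), 4))
```

In this toy case the weighted ensemble yields a lower RMSE and MAE than any single model's predictions, which is the kind of gain the paper reports for its integrated model.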
