Abstract

The use of machine learning in weather prediction is growing rapidly as an alternative to conventional numerical weather prediction. However, machine-learning predictors such as neural-network-based Long Short-Term Memory (LSTM) models perform poorly on extreme events, where the class distribution is highly imbalanced. This research examines the use of focal loss in an LSTM to obtain a cost-sensitive machine-learning model. The model used Global Forecasting System data and Global Satellite Measurement of Precipitation data for the years 2017-2020. Hyperparameter configurations were tested with the hyperband method over the number of nodes and the number of iterations, in three scenarios (2, 3, and 4 classes). The results showed improved performance over the non-cost-sensitive LSTM: an average increase of 25% in accuracy and 11% in F1-score for the 2-class scenario, 15% in accuracy and 21% in F1-score for the 3-class scenario, and 15% in accuracy and 26% in F1-score for the 4-class scenario. These results also suggest that cost-sensitivity helps machine-learning models detect classes with extreme ratios, since the average performance gain grows as the number of classification scenarios increases.
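As a minimal sketch of the cost-sensitive loss the abstract refers to, the binary form of focal loss (Lin et al.) can be written in NumPy as follows. The hyperparameter values (`gamma=2.0`, `alpha=0.25`) are common defaults for illustration only and are not the configuration reported in this paper:

```python
import numpy as np

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: scales cross-entropy by (1 - p_t)**gamma,
    down-weighting easy examples so training focuses on hard,
    minority-class samples."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # p_t: the predicted probability assigned to the true class
    p_t = np.where(y_true == 1, y_pred, 1.0 - y_pred)
    # alpha_t: per-class weight to counter class imbalance
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    return -np.mean(alpha_t * (1.0 - p_t) ** gamma * np.log(p_t))
```

With `gamma=0` and `alpha=0.5` the expression reduces to (half of) standard cross-entropy; increasing `gamma` shrinks the loss contribution of well-classified examples, which is what makes the model cost-sensitive toward rare extreme-event classes.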
