Abstract

Machine Learning (ML) algorithms are used to learn from data and make data-driven predictions; they rely on pattern recognition and computational learning applied to the data. Earth Orientation Parameters (EOP) are the parameters used to monitor the Earth's rotation. UT1-UTC is the EOP that tracks the time required by the Earth to complete a rotation relative to atomic time. This parameter is indispensable for many applications, such as precise satellite orbit determination and interplanetary space navigation. In this study, we will use novel ML algorithms to predict the UT1-UTC (IERS C04) time series and investigate their performance against each other as well as against conventional prediction and fitting methods such as Least Squares (LS), Auto-Regressive (AR), and Multivariate Autoregressive (MAR) methods. A diverse set of advanced ML algorithms will be tested: Random Forest (RF), Generalized Linear Model (GLM), Gradient Boosted Model (GBM), K-means, and Prophet. We aim to optimize the UT1-UTC prediction technique for short-term prediction up to 10 days. Finally, these ML predictions will be compared against those from the last Earth Orientation Parameters Prediction Comparison Campaign (EOP PCC), which ran from October 1, 2005 to February 28, 2008. This detailed study will help clarify the performance of ML techniques on the UT1-UTC time series and should lead to the further development of better ML-based prediction models.

Key words: Machine Learning, Earth Orientation Parameters (EOP), UT1-UTC, predictions
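To illustrate the kind of short-term prediction setup described above, the following is a minimal sketch (not the authors' implementation) of framing UT1-UTC prediction as supervised learning with lagged features, using scikit-learn's RandomForestRegressor. The variable names, lag count, horizon, and the synthetic stand-in series are all illustrative assumptions; the study itself uses the IERS C04 series and a wider range of algorithms.

```python
# Hedged sketch: 10-day-ahead prediction of a UT1-UTC-like series with a
# Random Forest on lagged features. Not the paper's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def make_lagged(series, n_lags, horizon):
    """Build (X, y) pairs: n_lags past values -> value `horizon` days ahead."""
    X, y = [], []
    for t in range(n_lags, len(series) - horizon):
        X.append(series[t - n_lags:t])
        y.append(series[t + horizon])
    return np.array(X), np.array(y)

# Synthetic stand-in for a daily UT1-UTC series (seconds); the real study
# would load the IERS C04 values here instead.
rng = np.random.default_rng(0)
ut1_utc = np.cumsum(rng.normal(0.0, 1e-4, 5000))

n_lags, horizon = 60, 10            # 60 days of history, 10-day-ahead target
X, y = make_lagged(ut1_utc, n_lags, horizon)

split = len(X) - 365                # hold out the last year for evaluation
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
mae = np.mean(np.abs(pred - y[split:]))
print(f"10-day-ahead MAE on the hold-out year: {mae:.6f} s")
```

The same lagged-feature framing could be swapped to a GLM or GBM regressor, while Prophet would instead be fitted directly on the dated series; the point here is only to show how a short-term horizon of up to 10 days can be posed as a regression target.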
