Low-Earth-orbit (LEO) satellites are widely acknowledged as a promising infrastructure for global Internet of Things (IoT) services. However, the Doppler effect presents a significant challenge for long-range (LoRa) modulation uplink connectivity. This study comprehensively examines the operational efficiency of LEO satellites with respect to the Doppler weather effect, using state-of-the-art artificial intelligence techniques. Two LEO satellite constellations, Globalstar and the International Space Station (ISS), were detected and tracked using ground radars in Perth and Brisbane, Australia, for 24 h starting 1 January 2024. The study involves modelling the constellation, calculating latency and frequency offset, and designing a hybrid Iterative Input Selection-Long Short-Term Memory Network (IIS-LSTM) integrated model to predict the Doppler weather profile for LEO satellites. The IIS algorithm selects relevant input variables for the model, while the LSTM algorithm learns and predicts patterns. This model is compared with Convolutional Neural Network (CNN) and Extreme Gradient Boosting (XGBoost) models. The results show that the packet delivery rate is above 91% for the most sensitive spreading factor, SF12, with a bandwidth of 11.5 MHz for Globalstar and 145.8 MHz for ISS NAUKA. The carrier frequency is 631 MHz for the ISS orbiting at 402.3 km and 500 MHz for Globalstar at 1414 km altitude, which aids in combating packet losses. The IIS-LSTM model achieved an accuracy of 97.51% and a loss of 1.17% with signal-to-noise ratios (SNRs) ranging from 0 to 30 dB. The XGBoost model has the fastest testing time, attaining ≈0.0997 s at higher SNRs with an accuracy of 87%; however, at lower SNRs it proves computationally expensive. IIS-LSTM attains the best computation time at lower SNRs, ≈0.4651 s, followed by XGBoost at ≈0.5990 s and CNN at ≈0.6120 s. The study calls for further research on LoRa Doppler analysis that considers atmospheric attenuation and other relevant space parameters.
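The abstract refers to calculating the frequency offset for each satellite pass. The sketch below is not the authors' code; it is a minimal illustration of a first-order Doppler offset, assuming the simple relation Δf = f_c · v_r / c. The 631 MHz carrier is taken from the abstract, while the 7.5 km/s radial velocity is an assumed worst-case value for a low-elevation LEO pass, not a figure from the study.

```python
# Minimal sketch (not the study's implementation): first-order Doppler
# frequency offset for a LEO pass, using delta_f = f_carrier * v_radial / c.
C = 299_792_458.0  # speed of light, m/s


def doppler_offset_hz(f_carrier_hz: float, v_radial_ms: float) -> float:
    """Frequency offset at a ground receiver for a given radial velocity.

    Positive v_radial (satellite approaching the receiver) raises the
    received frequency; negative (receding) lowers it.
    """
    return f_carrier_hz * v_radial_ms / C


if __name__ == "__main__":
    # Illustrative values only: 631 MHz carrier (from the abstract) and an
    # assumed ~7.5 km/s radial velocity near the start of a pass.
    offset = doppler_offset_hz(631e6, 7_500.0)
    print(f"Peak Doppler offset ≈ {offset / 1e3:.1f} kHz")  # ≈ 15.8 kHz
```

Under these assumed inputs the offset is on the order of tens of kilohertz, which is the scale of shift a LoRa uplink receiver would need to track or compensate during a pass.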