Abstract

Low Earth Orbit (LEO) satellite constellations have seen a surge in deployment over the past few years, owing to their ability to provide broadband Internet access and to collect vast amounts of Earth-observation data that can be used to develop AI on a global scale. Traditional machine learning (ML) approaches, which train a model by downloading satellite data to a ground station (GS), are impractical in this setting, so Federated Learning (FL) offers a potential solution. However, existing FL approaches cannot be readily applied because the challenging satellite-GS communication environment makes their training time excessively long. This paper proposes FedHAP, which introduces high-altitude platforms (HAPs) as distributed parameter servers (PSs) into FL for satellite communications (Satcom), or more concretely LEO constellations, to achieve fast and efficient model training. FedHAP consists of three components: 1) a hierarchical communication architecture, 2) a model dissemination algorithm, and 3) a model aggregation algorithm. Our extensive simulations demonstrate that FedHAP significantly accelerates FL model convergence compared to state-of-the-art baselines, cutting the training time from several days down to a few hours while achieving higher accuracy.
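The abstract does not detail FedHAP's aggregation algorithm, but the hierarchical idea of HAPs acting as distributed parameter servers can be sketched in the style of FedAvg-like weighted averaging. The function names, two-stage structure, and data layout below are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def weighted_average(models, weights):
    """Average model parameter vectors, weighted (e.g., by local sample counts)."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return sum(w * m for w, m in zip(weights, models))

def hierarchical_aggregate(per_hap_models, per_hap_counts):
    """Two-stage aggregation (hypothetical sketch):
    Stage 1: each HAP aggregates the models received from the satellites
             currently visible to it.
    Stage 2: the HAPs' partial aggregates are combined into a global model,
             weighted by the total data each HAP's aggregate represents."""
    hap_aggregates, hap_totals = [], []
    for models, counts in zip(per_hap_models, per_hap_counts):
        hap_aggregates.append(weighted_average(models, counts))
        hap_totals.append(sum(counts))
    return weighted_average(hap_aggregates, hap_totals)

# Toy example: two HAPs, three satellites, 1-D "model" parameters.
global_model = hierarchical_aggregate(
    [[np.array([1.0]), np.array([3.0])], [np.array([5.0])]],
    [[1, 1], [2]],
)  # -> array([3.5])
```

In practice the per-satellite weights and the inter-HAP exchange pattern would follow FedHAP's dissemination and aggregation algorithms, which the abstract only names.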
