Abstract

Federated Learning (FL) is a novel distributed learning paradigm in which local models are trained simultaneously on the data stored across multiple devices and are then aggregated into a global model. A promising use case of FL is training a global model on data collected by unmanned aerial vehicles (UAVs) during their flights, which is invaluable in scenarios where infrastructure is inaccessible (e.g., after a disaster). This is challenging, however, because limited resources must be shared among flight time, sensing, processing, and communication. In this paper, we address this resource problem for a set of heterogeneous UAVs with different computation and communication capabilities from a distributed point of view. We propose using Device-to-Device (D2D) communication to fairly redistribute the data collected so far by UAVs with different capabilities, posing the redistribution as an optimal transport problem. Our contribution is two-fold: (1) we obtain the fairest distribution of data given the UAVs’ computational capabilities such that the global learning time is minimized; (2) we devise a scheme based on Optimal Transport (OT) to achieve such a fair distribution among UAVs. The performance of the proposed techniques is demonstrated in an FL setting with different UAV topologies, with FL training performed on the MNIST dataset.
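
For illustration only, the fair-redistribution step described above can be sketched as a discrete optimal transport problem: each UAV's current share of the collected data is transported toward a target share proportional to its computational capability, at a D2D communication cost. The sketch below is a minimal, hypothetical example rather than the paper's implementation; the capability values, the distance-based cost model, and the use of the POT library's ot.emd solver are all assumptions.

```python
# Minimal sketch (assumptions, not the authors' implementation) of posing
# fair data redistribution among heterogeneous UAVs as optimal transport.
import numpy as np
import ot  # Python Optimal Transport (POT) library

rng = np.random.default_rng(0)
n_uavs = 4

# Hypothetical fraction of the total collected data currently held per UAV.
collected = np.array([0.40, 0.30, 0.20, 0.10])

# Hypothetical relative computational capabilities; the "fair" target gives
# more data to faster UAVs so that per-UAV training time is balanced.
capability = np.array([0.10, 0.20, 0.30, 0.40])
target = capability / capability.sum()

# Hypothetical D2D communication cost: pairwise distance between UAVs.
positions = rng.uniform(0.0, 100.0, size=(n_uavs, 2))
cost = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)

# Optimal transport plan: plan[i, j] is the fraction of the total data that
# UAV i should send to UAV j over D2D links.
plan = ot.emd(collected, target, cost)
print(np.round(plan, 3))
```

Under these assumptions, the resulting plan moves data away from slower, data-rich UAVs toward faster ones while preferring cheap (nearby) D2D links, which is the intuition behind the OT-based scheme.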
