Abstract

In this article, we propose FogFL, a fog-enabled federated learning framework that facilitates distributed learning for delay-sensitive applications in resource-constrained IoT environments. While federated learning (FL) is a popular distributed learning approach, it suffers from communication overhead and high computational requirements. Moreover, global aggregation in FL relies on a centralized server, which is prone to malicious attacks and can result in inefficient training models. We address these issues by introducing geospatially placed fog nodes into the FL framework as local aggregators. Each fog node is responsible for a defined demographic region, which helps share location-based information among applications operating in similar environments. Furthermore, we formulate a greedy heuristic for selecting an optimal fog node to assume the global aggregator's role in each communication round between the edge and the cloud, thereby reducing dependence on execution at the centralized server. Fog nodes in the FogFL framework reduce the communication latency and energy consumption of resource-constrained edge devices without affecting the global model's convergence rate, thereby increasing the system's reliability. Extensive deployment and experimental results corroborate that, in addition to reducing the number of global aggregation rounds, FogFL lowers energy consumption and communication latency by 92% and 85%, respectively, compared to the state of the art.
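The abstract does not spell out the greedy heuristic for choosing the per-round global aggregator, so the following is only a minimal sketch of how such a selection among fog nodes could look. The scoring criteria (mean latency to edge devices, residual capacity, and a staleness counter), their weights, and all identifiers are illustrative assumptions, not the paper's actual method.

# Hedged sketch: greedily pick the fog node that assumes the global
# aggregator's role in a given communication round. Scoring criteria and
# weights are assumptions for illustration; the paper's heuristic may differ.
from dataclasses import dataclass

@dataclass
class FogNode:
    node_id: str
    latency_to_edges_ms: float   # mean round-trip latency to its edge devices (assumed metric)
    residual_capacity: float     # normalized spare compute/energy budget in [0, 1] (assumed metric)
    rounds_since_selected: int   # staleness counter to spread load across fog nodes (assumed metric)

def greedy_select_global_aggregator(fog_nodes, w_lat=0.5, w_cap=0.3, w_stale=0.2):
    """Return the fog node with the lowest greedy cost for this round."""
    def cost(n: FogNode) -> float:
        # Lower latency, higher residual capacity, and longer time since last
        # selection all make a node more attractive (weights are illustrative).
        return (w_lat * n.latency_to_edges_ms
                - w_cap * 100.0 * n.residual_capacity
                - w_stale * 10.0 * n.rounds_since_selected)
    return min(fog_nodes, key=cost)

if __name__ == "__main__":
    nodes = [
        FogNode("fog-A", latency_to_edges_ms=12.0, residual_capacity=0.7, rounds_since_selected=3),
        FogNode("fog-B", latency_to_edges_ms=8.0, residual_capacity=0.4, rounds_since_selected=1),
        FogNode("fog-C", latency_to_edges_ms=20.0, residual_capacity=0.9, rounds_since_selected=5),
    ]
    print("Selected global aggregator:", greedy_select_global_aggregator(nodes).node_id)

In a per-round loop, the selected node would collect locally aggregated updates from the other fog nodes and produce the global model, instead of every round running through the centralized cloud server.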
