Federated learning offers a decentralized alternative to centralized training, enabling local model training on edge devices with only model updates shared with a central server; this minimizes data transfer and enhances privacy. This research explores the feasibility of using federated learning to train machine learning models directly on edge devices that communicate through REST APIs. It designs an architecture in which REST APIs coordinate communication between a central server and edge nodes, supporting efficient model distribution, update aggregation, and iterative refinement of the global model. The system is implemented in a simulated environment to evaluate its performance. Initial results show that federated learning significantly reduces data transfer and enhances privacy without compromising model accuracy: the REST API-based communication model effectively coordinates updates between the central server and edge devices, and privacy-preserving techniques maintain data security throughout the process. The study confirms that federated learning is a viable solution for edge computing, addressing key challenges in data privacy and communication efficiency, and opening pathways to scalable, effective distributed learning systems in real-world IoT applications.
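The distribute-train-aggregate loop described above can be illustrated with a minimal sketch of federated averaging (FedAvg-style aggregation), the kind of server-side update the central server would perform after collecting model updates from edge devices over REST. All names, the toy 1-D linear model, and the learning rate are illustrative assumptions, not the paper's actual implementation.

```python
def local_update(global_weights, local_data, lr=0.1):
    """Simulate one round of local training on an edge device:
    gradient steps for a 1-D linear model y = w * x (illustrative)."""
    w = list(global_weights)
    for x, y in local_data:
        pred = w[0] * x
        grad = 2 * (pred - y) * x   # d/dw of squared error
        w[0] -= lr * grad
    return w

def aggregate(updates, sizes):
    """Server-side weighted average of client updates (FedAvg-style):
    each client's update is weighted by its local dataset size."""
    total = sum(sizes)
    dim = len(updates[0])
    return [sum(u[i] * n for u, n in zip(updates, sizes)) / total
            for i in range(dim)]

# Simulated federated rounds: two edge devices, each holding private
# samples drawn from y = 2x; raw data never leaves a device.
global_w = [0.0]
clients = [
    [(1.0, 2.0), (2.0, 4.0)],   # device A's local samples
    [(3.0, 6.0)],               # device B's local samples
]
for _ in range(50):
    updates = [local_update(global_w, data) for data in clients]
    global_w = aggregate(updates, sizes=[len(d) for d in clients])
```

In a deployed system, `local_update` would run on each edge node and the weight vectors would travel as JSON payloads over the REST endpoints, while `aggregate` would run on the central server; only the updates cross the network, never the local samples.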