Heterogeneous cluster networks (HCNs) have recently benefited from federated learning (FL), which trains privacy-preserving models on distributed data. In heterogeneous networks (HetNets) and the Internet of Things (IoT), FL deployment is challenged by resource optimization, robustness, and security issues. Disregarding the nodes' clustering behavior and rapidly varying asynchronous streaming data poses a significant risk of compromising data security. Moreover, in HCN-based wireless sensor networks (WSNs), FL improves the performance of asynchronous nodes: distributed nodes in naturally clustered HCNs collectively train local and global models. In this paper, we propose an Intra-Clustered FL (ICFL) model that selects heterogeneous FL nodes in each cluster by jointly optimizing computation and communication, and that remains highly robust to heterogeneous data. Existing FL frameworks do not handle varying data quality across devices together with non-identical data distributions. ICFL protects sensitive asynchronous data from possible misuse while adapting to changing environments; in addition to being time-efficient, it runs on low-power computing nodes. Extensive simulation results show that ICFL outperforms FedCH in computational performance and identifies flexible conditions under which ICFL is more communication-efficient. In extensive testing, ICFL reduced the number of training rounds by 62% and increased accuracy by 6.5%; it executes evaluations 7.46 times faster than existing models, with a 4.39% improvement in average accuracy. Our results indicate that a resource-aware FL system can be successfully deployed in real-time applications.
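The intra-cluster node selection and aggregation idea can be illustrated with a minimal sketch. This is not the paper's algorithm; it is a hypothetical toy in which each node carries assumed per-round compute and communication costs, a scalar local update, and a sample count. Each cluster selects its cheapest nodes, aggregates them FedAvg-style, and the cluster models are then averaged into a global model:

```python
import random

random.seed(0)

# Hypothetical setup: 3 clusters of 8 nodes each, with random cost
# profiles and toy scalar "model updates" (all values are assumptions).
clusters = {
    cid: [
        {
            "compute": random.uniform(1, 10),    # per-round compute cost
            "comm": random.uniform(1, 10),       # per-round communication cost
            "update": random.gauss(0.5, 0.1),    # local model update (toy scalar)
            "samples": random.randint(10, 100),  # local dataset size
        }
        for _ in range(8)
    ]
    for cid in range(3)
}

def select_nodes(nodes, k):
    """Pick the k nodes with the lowest combined compute + communication cost."""
    return sorted(nodes, key=lambda n: n["compute"] + n["comm"])[:k]

def aggregate(nodes):
    """FedAvg-style weighted average of local updates by sample count."""
    total = sum(n["samples"] for n in nodes)
    return sum(n["update"] * n["samples"] for n in nodes) / total

# Intra-cluster step: each cluster aggregates only its selected nodes...
cluster_models = {cid: aggregate(select_nodes(nodes, k=3))
                  for cid, nodes in clusters.items()}
# ...then the cluster models are combined into the global model.
global_model = sum(cluster_models.values()) / len(cluster_models)
print(global_model)
```

Selecting nodes per cluster (rather than globally) keeps every cluster represented in the global model while still skipping the most expensive participants, which is the intuition behind optimizing computation and communication inside each cluster.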