In the rapidly advancing landscape of machine learning, Federated Learning (FL) stands as a transformative paradigm that preserves data privacy by training models collaboratively without centralizing raw data. Existing FL research has primarily concentrated on refining the learning process, often overlooking the critical goal of minimizing communication costs, especially for clients with limited resources. This article introduces a new approach, Explainable AI Empowered Resource Management for Enhanced Communication Efficiency in Hierarchical Federated Learning (XRM-HFL). XRM-HFL employs an explainable AI-driven resource management strategy to predict each client's computational resource requirements. Its hierarchical framework reduces communication burdens by delegating communication with the cloud server to a designated Node Controller (NC) that performs edge-level aggregation. Empirical evaluations demonstrate that XRM-HFL outperforms prior methods in communication efficiency, resource utilization, and convergence speed. The model exhibits notable decreases in intercommunication expenses, global costs, failed nodes, and execution time, along with increased resource utilization. XRM-HFL thus offers resource-constrained clients an effective federated learning solution that minimizes costs while preserving model accuracy.
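The hierarchical aggregation described above can be illustrated with a minimal sketch. The abstract does not specify XRM-HFL's exact protocol, so the function names and the sample-weighted averaging scheme below are illustrative assumptions: each Node Controller aggregates its clients' updates at the edge, and only the edge-level results are sent to the cloud server for global aggregation.

```python
# Hypothetical sketch of two-tier (hierarchical) federated averaging.
# Names and aggregation rule are assumptions, not XRM-HFL's actual protocol.

def weighted_average(updates):
    """Aggregate (weights, n_samples) pairs by a sample-weighted mean.

    Returns the averaged weight vector and the total sample count,
    so the result can itself be aggregated at the next tier.
    """
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    agg = [0.0] * dim
    for weights, n in updates:
        for i, v in enumerate(weights):
            agg[i] += v * n / total
    return agg, total

def hierarchical_round(edge_groups):
    """One round of two-tier aggregation.

    Each group of client updates is first averaged by its Node
    Controller at the edge; the cloud then averages only the
    NC-level results, cutting client-to-cloud communication.
    """
    edge_updates = [weighted_average(group) for group in edge_groups]
    global_weights, _ = weighted_average(edge_updates)
    return global_weights
```

With sample-weighted averaging at both tiers, the hierarchical result matches a flat average over all clients, while each client communicates only with its local NC rather than the cloud.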