Abstract

6G will mark the next horizon, moving from connected people and things toward the intelligence of everything. Machine learning (ML) brings artificial intelligence to vehicular edge computing (VEC), further unlocking the potential of data. Federated learning (FL) coordinates collaborative training between vehicular clients and the road side unit (RSU), enabling efficient distributed data processing. However, uploading large numbers of model parameters from the vehicular clients occupies substantial communication resources, and link stability cannot be guaranteed because of vehicle mobility, which can lead to expensive communication overhead in VEC. This paper therefore proposes a communication-efficient federated double distillation (FedDD) framework, which comprehensively considers three-dimensional attributes of the vehicles and dynamically selects cluster-heads (CHs) to improve transmission efficiency. Knowledge distillation is then integrated into federated learning to perform multiple rounds of parameter distillation, significantly reducing the communication overhead. Experimental results show that, compared with traditional FedAvg, FedDD reduces communication overhead by three orders of magnitude; compared with FTTQ, it reduces communication overhead by 82%, improving the communication efficiency of FL at the cost of only a small loss in accuracy.
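To make the communication saving concrete, the sketch below illustrates the general federated-distillation pattern the abstract builds on: each client uploads soft predictions on a small shared reference set instead of full model weights. This is a minimal sketch only, not the paper's method; FedDD's cluster-head selection and two-stage ("double") distillation are described in the full text, and every name, model, and hyperparameter here (the linear student models, `X_ref`, the averaging teacher) is an illustrative assumption.

```python
# Minimal sketch of communication-efficient federated distillation.
# Instead of uploading full model weights each round (as in FedAvg),
# each client uploads only its soft predictions ("knowledge") on a
# small shared, label-free reference set. All names and numbers are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
D, C, N_REF, N_CLIENTS = 1000, 4, 32, 5  # feature dim, classes, reference size, clients

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Shared reference inputs known to everyone (RSU and vehicles).
X_ref = rng.normal(size=(N_REF, D))

# Each vehicular client holds a private linear model W of shape (C, D).
clients = [rng.normal(scale=0.01, size=(C, D)) for _ in range(N_CLIENTS)]

for rnd in range(10):
    # 1) Local training on private data would happen here; omitted.
    # 2) Each client uploads only soft predictions on X_ref:
    #    N_REF * C floats instead of C * D weights -- the saving.
    soft_preds = [softmax(X_ref @ W.T) for W in clients]
    # 3) The server (RSU) aggregates the knowledge; a plain average here.
    teacher = np.mean(soft_preds, axis=0)
    # 4) Each client distills toward the aggregated teacher with a few
    #    gradient steps on cross-entropy(teacher, student) over X_ref.
    for W in clients:
        for _ in range(5):
            p = softmax(X_ref @ W.T)
            grad = (p - teacher).T @ X_ref / N_REF  # dCE/dW for a linear model
            W -= 0.5 * grad

print("per-round upload per client (floats):", N_REF * C, "vs FedAvg:", C * D)
```

With these toy sizes the per-client upload drops from 4000 floats to 128 per round; for real deep models (millions of parameters) the gap widens accordingly, which is the effect the abstract's orders-of-magnitude comparison against FedAvg refers to.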
