Abstract

In 6G Internet of Vehicles (IoV) systems, Federated Learning (FL) is commonly used to train a joint model between vehicles and roadside units (RSUs). However, due to vehicle mobility, the links between vehicles are unstable, and the frequent exchange of FL model parameters over these links increases communication overhead. We therefore propose a communication-efficient Federated Double Distillation (FedDD) framework. In particular, cluster heads are dynamically selected as the distributed learning clients based on three-dimensional attributes, improving collaborative transmission efficiency. Knowledge distillation is then integrated into the federated learning framework to reduce the communication overhead caused by frequent parameter exchange over unstable links. Experimental results show that, compared with the benchmark FedAvg algorithm, FedDD reduces communication overhead by three orders of magnitude while sacrificing only a small amount of accuracy.
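The communication saving in distillation-based FL comes from clients exchanging soft predictions (logits) on a shared public set instead of full model parameter vectors. The sketch below illustrates only this accounting idea and a temperature-scaled aggregation step; all names and sizes (NUM_CLIENTS, MODEL_PARAMS, PUBLIC_SET, the temperature T) are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch: per-round upload cost of parameter exchange (FedAvg-style)
# vs. logit exchange (distillation-style). All constants are assumptions.
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 10           # e.g., selected cluster heads acting as FL clients (assumed)
MODEL_PARAMS = 1_000_000   # parameters in each local model (assumed)
PUBLIC_SET = 1_000         # shared public samples used for distillation (assumed)
NUM_CLASSES = 10           # output dimension of the logits (assumed)
BYTES_PER_FLOAT = 4        # float32 parameters

def softmax(z, T=1.0):
    """Temperature-scaled softmax used to form soft labels."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# FedAvg-style round: every client uploads its full parameter vector.
fedavg_upload = NUM_CLIENTS * MODEL_PARAMS * BYTES_PER_FLOAT

# Distillation-style round: every client uploads logits on the public set.
# Random logits stand in for the outputs of locally trained models.
client_logits = rng.normal(size=(NUM_CLIENTS, PUBLIC_SET, NUM_CLASSES)).astype(np.float32)
distill_upload = client_logits.nbytes

# The server averages the clients' logits and applies a temperature-scaled
# softmax; clients would then train locally against these soft labels.
soft_labels = softmax(client_logits.mean(axis=0), T=2.0)

print(f"FedAvg-style upload per round:       {fedavg_upload / 1e6:8.2f} MB")
print(f"Distillation-style upload per round: {distill_upload / 1e6:8.2f} MB")
```

With these assumed sizes the logit exchange is roughly two orders of magnitude smaller per round; the actual saving depends on the model size, public-set size, and number of classes, which is why the paper's reported reduction differs.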
