Abstract
With the advent of next-generation wireless networks, vehicles are expected to be empowered by artificial intelligence (AI). By connecting vehicles to a cloud server via wireless communication, federated learning (FL) allows vehicles to collaboratively train deep learning models to support intelligent services such as autonomous driving. However, the large number of vehicles and the increasing size of model parameters pose challenges for FL-empowered connected vehicles. Since communication bandwidth is insufficient to upload full-precision local models from numerous vehicles, model compression is usually applied to reduce the transmitted data size. Nevertheless, conventional model compression methods may not be practical for resource-constrained vehicles because they increase the computational overhead of FL training. Moreover, existing methods often overlook the overhead of downloading the global model, since they were originally designed for centralized learning rather than FL. In this paper, we propose a ternary quantization based model compression method for communication-efficient FL on resource-constrained connected vehicles. Specifically, we first propose a ternary quantization based local model training algorithm that optimizes quantization factors and model parameters simultaneously. Then, we design a communication-efficient FL approach that reduces both upstream and downstream communication overhead. Finally, simulation results validate that the proposed method incurs the lowest communication and computational overhead for FL training while maintaining the desired model accuracy, compared to existing model compression methods.
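To illustrate the general idea of ternary quantization referred to in the abstract, the sketch below ternarizes a weight tensor into {-alpha, 0, +alpha} using the common TWN-style threshold heuristic. This is a minimal, hypothetical example, not the paper's algorithm: the paper jointly optimizes the quantization factors during local training, whereas here alpha and the threshold delta are simply computed from the weight statistics.

```python
import numpy as np

def ternarize(weights, delta_scale=0.7):
    """Ternarize a weight tensor into {-alpha, 0, +alpha}.

    The threshold rule (delta_scale * mean |w|) is the common TWN-style
    heuristic and is only an assumption here; the paper's own scheme also
    trains the quantization factors alongside the model parameters.
    """
    # Threshold below which weights are zeroed out.
    delta = delta_scale * np.mean(np.abs(weights))
    mask = np.abs(weights) > delta
    # Scaling factor: mean magnitude of the surviving weights.
    alpha = np.abs(weights[mask]).mean() if mask.any() else 0.0
    ternary = alpha * np.sign(weights) * mask
    return ternary, alpha, delta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(4, 4)).astype(np.float32)
    q, alpha, delta = ternarize(w)
    print("alpha =", alpha, "delta =", delta)
    # Each entry of q is -alpha, 0, or +alpha, so only the sign pattern
    # (2 bits per weight) plus alpha needs to be transmitted, which is the
    # source of the upload savings the abstract describes.
    print(q)
```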