Abstract

Cooperative computing promises to enhance the performance and safety of autonomous vehicles by drawing on a growing amount, diversity, and scope of data resources. However, the effective and privacy-preserving use of multi-modal, multi-source data remains an open challenge in building such cooperative mechanisms. Recently, Transformers have demonstrated their potential for the unified representation of multi-modal features, offering a new perspective on the representation and fusion of the diverse inputs of intelligent vehicles. Federated learning provides a distributed learning scheme that promises privacy-preserving sharing of data resources among vehicles. Toward privacy-preserving computing and cooperation in autonomous driving, this paper reviews recent progress in Transformers, federated learning, and cooperative perception, and proposes a hierarchical structure of Transformers for intelligent vehicles comprising Vehicular Transformers, Federated Vehicular Transformers, and the Federation of Vehicular Transformers to exploit their potential for privacy-preserving collaboration.
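
To make the combination of the two ingredients concrete, the sketch below shows one way a federated scheme could wrap per-vehicle Transformer encoders: each vehicle trains locally on its own sensor features and only model weights, never raw data, are shared and averaged. This is a minimal FedAvg-style illustration under assumed names and shapes, not the architecture proposed in the paper.

```python
# Illustrative FedAvg-style round over per-vehicle Transformer encoders.
# All model sizes, losses, and function names are hypothetical.
import copy
import torch
import torch.nn as nn


def make_vehicular_transformer(d_model: int = 64, nhead: int = 4, num_layers: int = 2) -> nn.Module:
    """Small Transformer encoder standing in for a per-vehicle feature-fusion model."""
    layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=num_layers)


def local_update(model: nn.Module, features: torch.Tensor, steps: int = 1) -> dict:
    """One round of local training on a vehicle's private features (placeholder objective)."""
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    for _ in range(steps):
        optimizer.zero_grad()
        out = model(features)        # (batch, sequence, d_model)
        loss = out.pow(2).mean()     # dummy loss; a real perception head would go here
        loss.backward()
        optimizer.step()
    return copy.deepcopy(model.state_dict())


def federated_average(state_dicts: list) -> dict:
    """Element-wise average of the vehicles' weight updates (FedAvg)."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg


if __name__ == "__main__":
    global_model = make_vehicular_transformer()
    # Each "vehicle" holds its own private feature batch; only weights leave the vehicle.
    private_batches = [torch.randn(8, 16, 64) for _ in range(3)]
    local_states = []
    for batch in private_batches:
        local_model = make_vehicular_transformer()
        local_model.load_state_dict(global_model.state_dict())
        local_states.append(local_update(local_model, batch))
    global_model.load_state_dict(federated_average(local_states))
```

In this toy round, the coordinator never sees the vehicles' feature batches, which is the privacy-preserving property the federated layer of the proposed hierarchy is meant to provide.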
