Abstract

In collaborative missions, it is crucial for each agent to maintain a clear understanding of the status of the other agents. Various related works have addressed this need, including virtual reality interaction and real-world swarm tasks. However, these forms of collaboration still face certain limitations: (i) Limited range: in virtual reality, users interact with others over a network within a predefined area and cannot move over longer distances. (ii) Weak or absent GPS signals: swarm tasks that rely solely on localization sources such as GPS often suffer from drift or interruptions. Moreover, high-precision localization devices are rarely deployed in consumer-grade products. To address these challenges, we propose a multi-agent collaborative localization framework based on a camera, an IMU (Inertial Measurement Unit), and a remote interaction module. This framework not only achieves globally consistent localization but also overcomes the range limitations of Wi-Fi and Bluetooth. Additionally, we tackle the issue of scale ambiguity in visual-inertial collaborative localization. Simulations and physical experiments validate the effectiveness of the proposed multi-agent collaboration with remote interaction. Unlike traditional approaches that depend on external devices such as motion capture systems or GPS, our system achieves globally consistent localization internally.
