Abstract

This work presents a compact stand-alone orientation system based on visual SLAM (Simultaneous Localization and Mapping). Unlike other modern approaches, our SLAM algorithm was developed using error models of rate sensors and line-of-sight measurements of unique features extracted from the video stream delivered by a monocular camera. This approach allows seamless (tight) integration of inertial measurement units (IMUs) and exogenous measurements provided by a wide array of range and angular sensors, such as radars, LIDARs, etc. The developed algorithm is implemented on an NVIDIA Jetson Nano computer (just 100×80 mm) with a dedicated cooling system. The total weight of the system is 240 grams, and its power consumption is 5 W.
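The tight IMU/vision fusion described above can be illustrated with a minimal sketch. This is not the authors' algorithm: it is a deliberately simplified 2D extended-Kalman-filter example in which a heading estimate is propagated with a gyro rate measurement (using a noise term standing in for the "error model of rate sensors") and corrected with a line-of-sight (bearing) measurement to a landmark at a known position. All function names, landmark positions, and noise values are illustrative assumptions.

```python
import numpy as np

def predict(theta, P, omega_meas, dt, q_gyro):
    """Propagate heading with the measured turn rate; inflate the
    variance P according to a simple gyro noise model."""
    theta = theta + omega_meas * dt
    P = P + q_gyro * dt
    return theta, P

def update_bearing(theta, P, bearing_meas, landmark, pos, r_bearing):
    """Correct heading with a bearing (line-of-sight) measurement to a
    landmark at a known position, via a scalar Kalman update."""
    expected = np.arctan2(landmark[1] - pos[1], landmark[0] - pos[0]) - theta
    # Wrap the innovation to (-pi, pi] to handle angle discontinuities.
    innov = np.arctan2(np.sin(bearing_meas - expected),
                       np.cos(bearing_meas - expected))
    H = -1.0                      # d(bearing)/d(theta)
    S = H * P * H + r_bearing     # innovation variance
    K = P * H / S                 # Kalman gain
    theta = theta + K * innov
    P = (1.0 - K * H) * P
    return theta, P

# Toy run: gyro-only prediction drifts; the camera bearing corrects it.
theta, P = 0.0, 0.5                                   # initial estimate, variance
theta, P = predict(theta, P, omega_meas=0.05, dt=1.0, q_gyro=0.01)
landmark, pos = np.array([10.0, 0.0]), np.array([0.0, 0.0])
true_theta = 0.1                                      # simulated ground truth
bearing = np.arctan2(landmark[1] - pos[1],
                     landmark[0] - pos[0]) - true_theta
theta, P = update_bearing(theta, P, bearing, landmark, pos, r_bearing=0.05)
print(theta, P)  # estimate pulled toward true_theta, variance reduced
```

In a full visual SLAM system the state would also include position and the landmark map itself, and the bearings would come from features tracked across camera frames; this sketch only shows the scalar core of the rate-propagation / line-of-sight-correction cycle.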
