Abstract
This paper discusses a spatial sensor to identify and track objects in the environment. The sensor is composed of an RGB-D camera, which provides point clouds and RGB images, and an egomotion sensor able to measure its displacement in the environment. The proposed sensor also incorporates a data processing strategy developed by the authors that confers different skills on the sensor. The adopted approach is based on four analysis steps: egomotion, lexical, syntactic, and prediction analysis. As a result, the proposed sensor can identify objects in the environment, track them, calculate their direction, speed, and acceleration, and also predict their future positions. The online detector YOLO is used to identify objects, and its output is combined with the point cloud information to obtain the spatial location of each identified object. The sensor can operate with higher precision and a lower update rate, using YOLOv2, or with a higher update rate and lower accuracy, using YOLOv3-tiny. The object tracking, egomotion, and collision prediction skills are tested and validated using a mobile robot with precise speed control. The presented results show that the proposed sensor (hardware + software) achieves satisfactory accuracy and update rate, supporting its use in mobile robotics. This paper's contribution is an algorithm for identifying, tracking, and predicting the future position of objects, embedded in compact hardware. Thus, the contribution of this paper is to convert raw data from traditional sensors into useful information.
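The abstract mentions combining YOLO detections with point cloud data to locate objects and extrapolating their motion to predict future positions. The following is a minimal sketch of how such a fusion and prediction step could look, assuming a pinhole camera model with hypothetical intrinsics (FX, FY, CX, CY) and a depth image aligned to the RGB frame; it is an illustration under these assumptions, not the authors' exact procedure.

```python
import numpy as np

# Hypothetical intrinsics; real values come from the RGB-D camera calibration.
FX, FY = 525.0, 525.0   # focal lengths in pixels (assumed)
CX, CY = 319.5, 239.5   # principal point in pixels (assumed)

def detection_to_3d(bbox, depth_image):
    """Back-project the centre of a YOLO bounding box (u1, v1, u2, v2, in
    pixels) to a 3D point in the camera frame, using the aligned depth
    image in metres. Generic pinhole-camera sketch, not the paper's
    implementation."""
    u = int((bbox[0] + bbox[2]) / 2)
    v = int((bbox[1] + bbox[3]) / 2)
    z = float(depth_image[v, u])          # depth at the box centre
    if z <= 0.0:                          # invalid or missing depth reading
        return None
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

def predict_position(p, v, a, dt):
    """Constant-acceleration extrapolation of a tracked object's position:
    p(t + dt) = p + v*dt + 0.5*a*dt^2, with p, v, a as 3D vectors in the
    sensor frame."""
    return p + v * dt + 0.5 * a * dt**2
```

In practice, the estimated velocity and acceleration would be obtained from successive 3D positions of the same tracked object, after compensating for the sensor's own motion reported by the egomotion stage.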