Abstract

This paper presents a novel sonar-vision method, called side sonar vision (SSV), for navigating mobile robots in a known environment. It adopts omni-directional images and divides the surrounding sonar view into three parts: the front, right, and left sides. Each side is kept under continuous scrutiny by an individual agent. SSV analyses the data of each side separately and produces two key parameters: angle and length. These parameters are sent to a multi-layer navigation control module with two main nodes: path estimation and trajectory. The proposed method requires neither calibration nor image conversion. Experiments show that the robot moves along the path smoothly and could automatically track up to 98% of it without any collision with obstacles. The processing time was about 120 ms.
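To make the described data flow concrete, the sketch below illustrates one plausible reading of the SSV pipeline: omni-directional sonar readings are split into left, front, and right sectors, each side agent reduces its sector to an angle and a length, and a two-stage controller performs path estimation and trajectory generation. This is not the paper's implementation; all function names, sector boundaries, and thresholds are assumptions added for illustration.

```python
# Illustrative sketch of the SSV data flow described in the abstract.
# All names, sector boundaries, and thresholds are assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SideReport:
    angle: float    # bearing (degrees) of the nearest obstacle in this sector
    length: float   # range (metres) to that obstacle

def split_sectors(readings: List[Tuple[float, float]]):
    """Divide omni-directional sonar readings (bearing_deg, range_m)
    into left, front, and right sectors (boundaries are illustrative)."""
    left  = [r for r in readings if   30.0 <  r[0] <= 150.0]
    front = [r for r in readings if  -30.0 <= r[0] <=  30.0]
    right = [r for r in readings if -150.0 <= r[0] <  -30.0]
    return left, front, right

def side_agent(sector: List[Tuple[float, float]]) -> SideReport:
    """Each side agent reports the angle and length of its nearest obstacle."""
    if not sector:
        return SideReport(angle=0.0, length=float("inf"))
    bearing, rng = min(sector, key=lambda r: r[1])
    return SideReport(angle=bearing, length=rng)

def navigation_controller(left: SideReport, front: SideReport,
                          right: SideReport) -> Tuple[float, float]:
    """Two-stage controller: estimate a clear heading (path estimation),
    then derive a speed along it (trajectory). Values are assumptions."""
    safe = 0.5  # minimum front clearance in metres (illustrative)
    if front.length > safe:
        heading = 0.0                     # path ahead is clear: go straight
    elif left.length >= right.length:
        heading = 45.0                    # steer toward the freer left side
    else:
        heading = -45.0                   # steer toward the freer right side
    speed = min(0.3, 0.3 * front.length)  # slow down as the front gap closes
    return heading, speed
```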
