Abstract

Localisation in conventional autonomous robot navigation, which relies on centralised on-board intelligence, faces many challenging issues. Moving away from the centralised on-board solution, this paper proposes a distributed approach that deploys wireless nodes comprising vision sensors and embedded computers in the environment to process images, localise the robot and dynamic obstacles, and plan and control the robot's motion. This scheme offloads the massive on-board information processing required by a conventional autonomous robot to its operating environment. The objective is to enable robots with limited on-board intelligence to carry out complex navigation functions. The architectural design and realisation of a network of wireless vision sensors in the environment for global robot navigation path planning and local motion control are described, and the proposed algorithms are reviewed. The details of the wireless sensor network connectivity, the system functions and modules, and the organisation and connection of the system components are presented. The hardware and software realisation and experimental results are also discussed.
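To make the division of labour concrete, the following minimal sketch (not taken from the paper) illustrates one way a wireless vision-sensor node could perform localisation locally and forward pose estimates to a global planning node; the function detect_robot_pose, the message format, and the planner address are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (assumed design, not the paper's implementation):
    # a wireless vision-sensor node localises the robot in its own field
    # of view and streams the global pose to a path-planning node.
    import json
    import socket
    import time

    PLANNER_ADDR = ("192.168.0.10", 5005)   # assumed address of the planner node


    def detect_robot_pose(frame):
        """Placeholder for the node's on-board image processing: returns
        (x, y, heading) of the robot in calibrated ground-plane coordinates,
        or None if the robot is not visible in this node's view."""
        raise NotImplementedError


    def run_sensor_node(node_id, camera):
        """Capture frames, localise the robot on the embedded computer,
        and send pose estimates over the wireless network for global
        path planning and local motion control."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        while True:
            frame = camera.read()            # embedded computer grabs a frame
            pose = detect_robot_pose(frame)  # processing stays off the robot
            if pose is not None:
                msg = {"node": node_id, "pose": pose, "t": time.time()}
                sock.sendto(json.dumps(msg).encode(), PLANNER_ADDR)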
