Abstract

In robotics, many path-planning algorithms have been proposed, but almost all are evaluated only in 2-D computer simulations or in a few experiments without any human participant. Today, humans increasingly share their living space with mobile and pet robots, so path-planning algorithms, especially those governing interactions between a human and robots, should be evaluated in that living space. However, a practical test between a human and robots in a real space is expensive and dangerous. For this purpose, we developed a wearable computer system that lets a human being evaluate the path-planning algorithm installed in each robot. The system mainly consists of an HDS (human detecting system) and an HMD (head-mounted display). First, while the user walks through an arbitrary 3-D space, we detect the human's position and orientation with four omnidirectional cameras. We then refine the observed position sequence into a precise one using our observation-space method. Synchronously, the human virtually walks through a 3-D graphics environment presented on the HMD. In this virtualized world, the human avoids mobile robots using his or her own eyes and legs, while each robot is controlled by a given path-planning algorithm. Using this wearable system, the feasibility of each path-planning algorithm is checked in practice against the visual and dynamical abilities of a human being. We report experimental results obtained with this system.
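The evaluation loop the abstract describes, detecting the human's position, refining the noisy observation sequence, and stepping each robot's planner against the refined track, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `smooth` moving average stands in for the (unspecified) observation-space method, and `avoidance_step` is a hypothetical toy planner substituted for whichever path-planning algorithm is under test.

```python
import math

def smooth(observed, window=3):
    """Moving-average stand-in for the paper's observation-space
    refinement of the raw position sequence (details not given here)."""
    out = []
    for i in range(len(observed)):
        chunk = observed[max(0, i - window + 1): i + 1]
        xs = [p[0] for p in chunk]
        ys = [p[1] for p in chunk]
        out.append((sum(xs) / len(chunk), sum(ys) / len(chunk)))
    return out

def avoidance_step(robot, human, speed=0.5):
    """Hypothetical toy planner: the robot steps directly away from
    the human's current position at a fixed speed."""
    dx, dy = robot[0] - human[0], robot[1] - human[1]
    d = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (robot[0] + speed * dx / d, robot[1] + speed * dy / d)

# One simulated trial: a noisy human track from the HDS, one robot
# controlled by the planner under test. In the real system the refined
# human position would also drive the HMD viewpoint each frame.
observed = [(0.0, 0.0), (1.1, 0.0), (1.9, 0.1), (3.05, -0.05)]
human_track = smooth(observed)
robot = (4.0, 0.0)
for human in human_track:
    robot = avoidance_step(robot, human)

# Distance between the robot and the human's final refined position.
print(math.hypot(robot[0] - human_track[-1][0],
                 robot[1] - human_track[-1][1]))
```

A real trial would replace `observed` with live camera measurements and `avoidance_step` with the algorithm being evaluated; the loop structure stays the same.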
