Abstract

In most previous work, the state space of a sensor-based robot has been designed from human intuition; however, a state space constructed from the human viewpoint is not always appropriate for the robot. Because the robot has a different body, sensors, and tasks, we argue that the robot should have its own internal state space determined by its actions, sensors, and tasks. This paper proposes an approach to constructing such a robot-oriented state space by statistically analyzing the actions, sensor patterns, and rewards obtained as results of task executions. During state space construction, the robot creates sensor pattern classifiers called empirically obtained perceivers (EOPs), whose combinations represent the robot's internal states. We have confirmed that the robot can construct original state spaces through its vision sensor and achieve navigation tasks with the obtained state spaces in a complicated simulated world.
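To make the idea of combining classifier outputs into internal states concrete, here is a minimal sketch, not the paper's actual algorithm: each hypothetical perceiver fires when a sensor vector falls inside an empirically chosen region, and the tuple of perceiver outputs indexes a discrete internal state. The helper names (`make_perceiver`, `internal_state`) and the distance-threshold classifier form are illustrative assumptions, not taken from the paper.

```python
# Sketch: internal states as combinations of binary sensor-pattern
# classifiers (EOP-like). The region shape and names are assumptions.

def make_perceiver(center, radius):
    """Return a classifier that fires near an observed prototype pattern."""
    def perceives(sensor_vec):
        dist = sum((s - c) ** 2 for s, c in zip(sensor_vec, center)) ** 0.5
        return dist <= radius
    return perceives

def internal_state(perceivers, sensor_vec):
    """Combine the perceivers' binary outputs into one state index."""
    state = 0
    for i, p in enumerate(perceivers):
        if p(sensor_vec):
            state |= 1 << i
    return state

# Two illustrative perceivers over a 2-D sensor space
eops = [make_perceiver((0.0, 0.0), 1.0), make_perceiver((3.0, 0.0), 1.0)]
print(internal_state(eops, (0.2, 0.1)))  # only the first perceiver fires -> 1
```

With n such perceivers the robot can distinguish up to 2^n internal states, each grounded in sensor patterns it has actually encountered rather than in a designer's partition of the world.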
