Abstract
We introduce an approach by which a robot learns, through interactions with its environment, to behave adequately to accomplish a given task, with little a priori knowledge about the environment or the robot itself. We briefly present three research topics in vision-based robot learning, in each of which visual perception is tightly coupled with actuator effects so as to learn an adequate behavior. First, we present a method of vision-based reinforcement learning by which a robot learns to shoot a ball into a goal. Next, we introduce a “motion sketch” for a one-eyed mobile robot to learn several behaviors such as obstacle avoidance and target pursuit. Finally, we show a method of purposive visual control consisting of an on-line estimator and a feedback/feedforward controller for uncalibrated camera-manipulator systems. All topics include real robot experiments.
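For readers unfamiliar with the reinforcement-learning component mentioned above, the sketch below shows a generic tabular Q-learning update of the kind commonly used for such state-action learning. The state and action sets (discretized visual observations, discrete motor commands) and the parameter values are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import random
from collections import defaultdict

# Illustrative tabular Q-learning sketch (assumed setup, not the paper's code).
# Assumption: states are discretized visual observations (e.g. ball/goal
# position in the image) and actions are discrete motor commands.

ALPHA = 0.1     # learning rate (assumed value)
GAMMA = 0.9     # discount factor (assumed value)
EPSILON = 0.2   # exploration rate (assumed value)

ACTIONS = ["forward", "backward", "turn_left", "turn_right"]
Q = defaultdict(float)  # maps (state, action) -> estimated return


def select_action(state):
    """Epsilon-greedy action selection over the discrete action set."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def update(state, action, reward, next_state):
    """One-step Q-learning update: Q <- Q + alpha * (target - Q)."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    target = reward + GAMMA * best_next
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])
```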