Abstract

The ball-catching system examined in this research consists of an omni-directional wheeled mobile robot and an image processing system comprising a dynamic stereo vision camera and a static camera, and was used to capture a thrown ball. The dynamic stereo vision camera tracked the thrown ball, while the static camera was used to navigate the omni-directional wheeled mobile robot. A Kalman filter augmented with deep learning was used to reduce visual measurement noise and to estimate the ball's position and velocity, from which the ball's future trajectory and landing point were predicted. Feedback linearization was applied to linearize the omni-directional wheeled mobile robot model and was then combined with a proportional-integral-derivative (PID) controller. The visual tracking algorithm was first validated in numerical simulation, and the performance of the designed system was then verified experimentally. The results confirm that the designed system can precisely catch a thrown ball.
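
As a rough illustration of the estimation step described above, the following Python sketch implements a standard linear Kalman filter over a ballistic ball model and extrapolates the landing point from the estimated state. The frame rate, noise covariances, and catching height are illustrative assumptions, and the deep-learning component mentioned in the abstract is not reproduced here.

import numpy as np

# Minimal sketch of ball-state estimation and landing-point prediction,
# assuming a 3-D constant-velocity model with gravity acting on z and
# position-only measurements from the stereo vision system. All numeric
# values are illustrative placeholders, not the paper's parameters.

DT = 1.0 / 30.0          # assumed camera frame period (s)
G = 9.81                 # gravitational acceleration (m/s^2)

# State vector: [x, y, z, vx, vy, vz]
F = np.eye(6)
F[0, 3] = F[1, 4] = F[2, 5] = DT                         # position integrates velocity
B = np.array([0, 0, -0.5 * G * DT**2, 0, 0, -G * DT])    # gravity acting on z
H = np.hstack([np.eye(3), np.zeros((3, 3))])             # measure position only

Q = np.eye(6) * 1e-4      # process noise covariance (placeholder)
R = np.eye(3) * 1e-2      # measurement noise covariance (placeholder)

def kalman_step(x, P, z):
    """One predict/update cycle for a new stereo measurement z = [x, y, z]."""
    # Predict under ballistic motion.
    x_pred = F @ x + B
    P_pred = F @ P @ F.T + Q
    # Update with the visual measurement.
    innov = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ innov
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

def predict_landing_point(x, catch_height=0.3):
    """Solve the ballistic equation for the time at which the ball reaches
    the catching height, then extrapolate the x-y position to that instant."""
    pz, vz = x[2], x[5]
    disc = vz**2 + 2.0 * G * (pz - catch_height)
    if disc < 0:
        return None                          # ball never reaches catch_height
    t_land = (vz + np.sqrt(disc)) / G        # take the descending root
    return np.array([x[0] + x[3] * t_land, x[1] + x[4] * t_land]), t_land

In the full system, each filtered state would be fed to the robot controller so that the platform can reach the predicted touchdown point before the ball arrives.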

Highlights

  • The method in which visual information in a feedback control loop is used to precisely control the motion, position, and posture of a robot is called visual servoing

  • In [2], two cameras were placed at the top and left of the work area for ping-pong ball catching and ball juggling

  • This system consisted of an omni-directional wheeled mobile robot and an image processing system that included an active stereo vision camera and a static vision camera

Summary

Introduction

The method in which visual information in a feedback control loop is used to precisely control the motion, position, and posture of a robot is called visual servoing. In robotic ball catching, system configurations, implementation methods, trajectory prediction algorithms, and control laws vary widely. The eye-in-hand configuration lets the camera move with the robotic arm, which improves trajectory prediction accuracy in open space, especially when the object is near the robot. Another way to catch a ball in a wide-open space is to combine a static stereo vision system with a mobile robot, as described below. We developed a combined omni-directional wheeled mobile robot and multi-camera vision system to catch a flying ball in a large workspace.
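
To make the navigation side concrete, the sketch below assumes a purely kinematic omni-directional platform, with world-frame pose rate equal to R(theta) times the body-frame velocity command, cancels the rotation term by feedback linearization, and closes the loop with a PID law on the world-frame pose error. The gains, control period, and the kinematic model itself are illustrative assumptions, not the controller parameters reported in the paper.

import numpy as np

# Minimal sketch of driving the omni-directional robot toward a target pose
# (e.g., the predicted touchdown point), assuming the kinematics
# pose_dot = R(theta) @ u_body and a PID law on the world-frame error.

KP = np.diag([2.0, 2.0, 1.5])   # proportional gains (placeholders)
KI = np.diag([0.1, 0.1, 0.05])  # integral gains (placeholders)
KD = np.diag([0.4, 0.4, 0.2])   # derivative gains (placeholders)
DT = 0.02                       # assumed control period (s)

def rotation(theta):
    """Body-to-world mapping for the planar pose [x, y, theta]."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

class PoseController:
    def __init__(self):
        self.int_err = np.zeros(3)
        self.prev_err = np.zeros(3)

    def step(self, pose, target):
        """Return a body-frame velocity command for the current pose."""
        err = target - pose                      # world-frame pose error
        self.int_err += err * DT
        d_err = (err - self.prev_err) / DT
        self.prev_err = err
        # Virtual world-frame control from the PID terms.
        v_world = KP @ err + KI @ self.int_err + KD @ d_err
        # Feedback linearization: invert the pose kinematics so the closed
        # loop behaves as a decoupled linear system in world coordinates.
        return rotation(pose[2]).T @ v_world

Mapping the resulting body-frame velocity command to individual wheel speeds depends on the wheel geometry of the platform and is omitted here.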

Image Processing and Visual Measurement
Active Stereo Vision
Trajectory Estimation and Prediction
Controller Design of the Omni-Directional Wheeled Mobile Robot
Implementation of the Designed System
Touchdown Point Prediction
Findings
Concluding Remarks