Abstract
In past decades, much progress has been made in vision-based robot control with traditional image processing methods. With advances in deep learning, convolutional neural networks (CNNs) have now replaced traditional image processing methods for object detection and recognition. However, it is not clear how CNN-based methods can be integrated into robot control theories in a stable and predictable manner for object detection and tracking, especially when the aspect ratio of the object is unknown and also varies during manipulation. In this article, we develop a vision-based control method for robots with an eye-in-hand configuration, which can be directly integrated with existing CNN-based object detectors. The task variables are generated from the bounding-box parameters output by any real-time CNN object detector, such as You Only Look Once (YOLO). To address the chattering problem of the bounding box, a long short-term memory (LSTM) network is used to provide smoothed bounding-box information. A vision-based controller is then proposed, following the task-space motion control design formulation, to keep an object of unknown aspect ratio in the center of the camera's field of view. The stability of the overall closed-loop control system is analyzed rigorously using a Lyapunov-like approach. Experimental results are presented to illustrate the performance of the proposed CNN-based robot controller.
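The abstract describes generating task variables from a detector's bounding box and driving the camera so the object stays centered in the field of view. A minimal sketch of that idea is shown below; the bounding-box format, the normalization, the gain, and the simple proportional law are all illustrative assumptions, not the controller or stability analysis developed in the article.

```python
import numpy as np

def task_error(bbox, img_w, img_h):
    """Centering error from a detector bounding box.

    bbox is assumed to be (x_min, y_min, x_max, y_max) in pixels
    (a hypothetical format; real detectors vary).
    Returns the box-center offset from the image center,
    normalized by the image dimensions.
    """
    cx = 0.5 * (bbox[0] + bbox[2])
    cy = 0.5 * (bbox[1] + bbox[3])
    return np.array([(cx - img_w / 2.0) / img_w,
                     (cy - img_h / 2.0) / img_h])

# Illustrative proportional task-space command (gain K is an assumption):
# drive the camera so the detected object moves toward the image center.
K = 2.0
e = task_error((300, 200, 420, 360), 640, 480)
v_cmd = -K * e
```

In the article, the bounding box would first be smoothed by the LSTM before the task error is computed, and the command would come from the task-space controller whose stability is established by the Lyapunov-like analysis.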
Published in: IEEE Transactions on Systems, Man, and Cybernetics: Systems