Abstract

This study presents the design, simulation, and implementation of a fuzzy logic-based joint controller (FLJC) on a six-degree-of-freedom (6-DOF) robotic arm with machine vision feedback, capable of picking up an object and placing it at a predetermined position. A 4-DOF M100RAK robotic arm coupled with a 2-DOF gripper is used as the platform for implementing the FLJC. The robotic arm is controlled in a closed loop. The inputs to the FLJC are the joint angles, the gripper coordinates, and the target object coordinates. The joint angles are measured using an MPU6050 six-axis (gyroscope + accelerometer) sensor. A machine vision system composed of a Kinect camera and the open-source Processing environment extracts the coordinates of the gripper and the target from the image of the workspace. The joint angles computed by the FLJC are transmitted as communication protocol packets to an Arduino microcontroller for servo control.
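As a rough illustration of the last step of this pipeline, the sketch below shows how an Arduino could receive the FLJC's computed joint angles over serial and drive the servos. The packet layout (a start byte followed by one angle byte per joint), the servo pin assignments, and the baud rate are assumptions made for illustration only; the abstract states that joint angles are sent in communication protocol packets but does not specify the packet format.

// Minimal Arduino-side sketch, assuming a packet of one start byte plus
// one angle byte (0-180 degrees) per joint. All constants are hypothetical.
#include <Servo.h>

const int NUM_JOINTS = 6;                                 // 4-DOF arm + 2-DOF gripper
const byte START_BYTE = 0xFF;                             // assumed packet delimiter
const int servoPins[NUM_JOINTS] = {3, 5, 6, 9, 10, 11};   // assumed PWM pins

Servo joints[NUM_JOINTS];

void setup() {
  Serial.begin(9600);                                     // assumed baud rate
  for (int i = 0; i < NUM_JOINTS; i++) {
    joints[i].attach(servoPins[i]);
  }
}

void loop() {
  // Wait until a full packet is buffered, then check for the start byte.
  if (Serial.available() >= NUM_JOINTS + 1 && Serial.read() == START_BYTE) {
    for (int i = 0; i < NUM_JOINTS; i++) {
      int angle = Serial.read();                          // one angle byte per joint
      joints[i].write(constrain(angle, 0, 180));          // clamp to servo range
    }
  }
}

In a closed-loop arrangement like the one described, the host PC running the FLJC would send such a packet each control cycle after reading the MPU6050 joint angles and the Kinect-derived gripper and target coordinates.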
