Abstract

The Microsoft Kinect sensor has shown the research community that it is more than just an interactive gaming device, owing to its multifunctional capabilities and high reliability. In this work, online hardware-in-the-loop (HIL) experimental data are used to apply human motion imitation to a 2-degree-of-freedom Lego Mindstorms NXT robotic arm. A simulation model of the DC motor used in the experiment is also presented. The input data acquired from the Kinect sensor are processed in a closed-loop PID controller with feedback from the motor encoders. The applied algorithms solve the overlapping input problem and allow simultaneous control of both the shoulder and elbow joints. The work is presented as a prototype to demonstrate the applicability of the algorithms for further development.
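
The paper does not give implementation details here, so the following is only a minimal sketch of the closed-loop structure the abstract describes: one discrete PID controller per joint, fed a Kinect-derived target angle and an encoder measurement, with both shoulder and elbow updated in the same cycle. The class name, gains, joint labels, and example values are hypothetical placeholders, not taken from the paper.

```python
class PID:
    """Minimal discrete PID controller; gains are illustrative, not from the paper."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per joint, so both joints are driven in the same loop iteration.
controllers = {"shoulder": PID(1.2, 0.01, 0.05), "elbow": PID(1.2, 0.01, 0.05)}

def control_step(targets, encoder_angles, dt):
    """One closed-loop cycle: Kinect targets in, encoder feedback in, motor commands out."""
    return {joint: controllers[joint].update(targets[joint], encoder_angles[joint], dt)
            for joint in controllers}

# Example cycle with made-up Kinect target angles and encoder readings (degrees).
commands = control_step({"shoulder": 45.0, "elbow": 90.0},
                        {"shoulder": 30.0, "elbow": 85.0}, dt=0.05)
print(commands)  # power commands that would be sent to the NXT servo motors
```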

Highlights

  • Three-dimensional data recognition and processing have recently become a vital research area for many modern applications in mechatronics and in engineering in general

  • Kinect, a motion detection and recognition smart sensor developed by Microsoft, allows human-computer interaction without the need for any physical controllers

  • The processor inside the Kinect recognizes human movement patterns and generates a corresponding skeleton model that can be provided to a computing environment such as Matlab [2]-[8]


Summary

Introduction

Three-dimensional data recognition and processing have recently become a vital research area for many modern applications in mechatronics and in engineering in general. Kinect, a motion detection and recognition smart sensor developed by Microsoft, allows human-computer interaction without the need for any physical controllers. The processor inside the Kinect recognizes human movement patterns and generates a corresponding skeleton model that can be provided to a computing environment such as Matlab [2]-[8]. The system recognizes the user’s hand gestures captured by the Kinect sensor and translates them into predefined commands, which are processed and sent to the robot [13]-[15]. An algorithm is developed to extract joint angles from the skeleton model detected by the Kinect sensor; two of these angles are sent to the NXT robot to control two linked servo motors representing the shoulder and elbow joints, as sketched below.
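
The angle-extraction formula is not given in this summary, so the sketch below is only one plausible reading: it assumes the Kinect skeleton supplies 3-D positions for the hip, shoulder, elbow, and wrist joints, and computes the shoulder and elbow angles from the vectors between adjacent joints. The function names, joint keys, reference joint choice, and example coordinates are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by points a-b-c, from 3-D Kinect coordinates."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def arm_angles(skeleton):
    """Extract shoulder and elbow angles from a dict of 3-D joint positions."""
    # Elbow angle: shoulder-elbow-wrist; shoulder angle: hip-shoulder-elbow (assumed reference).
    elbow = joint_angle(skeleton["shoulder"], skeleton["elbow"], skeleton["wrist"])
    shoulder = joint_angle(skeleton["hip"], skeleton["shoulder"], skeleton["elbow"])
    return shoulder, elbow

# Example with made-up coordinates (metres, Kinect camera frame).
skeleton = {"hip": (0.0, 0.0, 2.0), "shoulder": (0.0, 0.5, 2.0),
            "elbow": (0.3, 0.5, 2.0), "wrist": (0.3, 0.8, 2.0)}
print(arm_angles(skeleton))  # -> (90.0, 90.0) for this right-angled pose
```

The two returned angles would then serve as the setpoints of the per-joint controllers shown earlier.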

Mathematical Model
Simulation and Preliminary Results
Results and Analysis
Conclusions
