Abstract
The mechanical arm (manipulator) of a robot can be controlled using image processing techniques. The system detects the user's motion and colour in real time. The main goal is to extract the user's motion and colour from a video stream captured by a webcam facing the user. To track the motion of the user's hand, the system must select an image feature that is both robust and suitable for real-time tracking of moving points across consecutive frames. The intuition is that a point picked at random on a uniform surface is unlikely to be found again in the next frame, whereas a distinctive point that is more invariant under motion or illumination changes is more likely to be re-detected, making the tracking more robust. The feature extraction and tracking algorithms make the system more efficient. Colour detection is performed by converting the image to grayscale, computing the intensity, and tracking the corresponding colour, which allows the colour of the object to be identified. The user wears a hand glove and makes gestures in front of the camera; the system detects the colour and motion of the user's hand and translates them into manipulator actions.
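The abstract does not name a specific library or tracking algorithm. The following is a minimal sketch of the pipeline it describes, assuming OpenCV with Shi-Tomasi corner selection (to pick distinctive, trackable points) and Lucas-Kanade optical flow (to follow them across frames); the grayscale intensity step and the mapping of motion to a manipulator command are likewise assumptions for illustration, not the authors' implementation.

```python
import cv2
import numpy as np

# Shi-Tomasi parameters: select distinctive corners that survive motion/illumination changes
feature_params = dict(maxCorners=50, qualityLevel=0.3, minDistance=7, blockSize=7)

# Lucas-Kanade optical flow parameters: track the selected points across consecutive frames
lk_params = dict(winSize=(15, 15), maxLevel=2,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 0.03))

cap = cv2.VideoCapture(0)          # webcam facing the user
ret, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
prev_pts = cv2.goodFeaturesToTrack(prev_gray, mask=None, **feature_params)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Track feature points from the previous frame into the current one
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts,
                                                     None, **lk_params)
    good_new = next_pts[status == 1]
    good_old = prev_pts[status == 1]

    if len(good_new) > 0:
        # Average displacement of tracked points approximates the hand's motion;
        # mapping this vector to a manipulator command is a hypothetical step here
        motion = (good_new - good_old).mean(axis=0)

    # Grayscale intensity of the frame, used here as a stand-in for the
    # intensity-based colour check described in the abstract
    mean_intensity = gray.mean()

    prev_gray = gray
    prev_pts = good_new.reshape(-1, 1, 2)

    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

In this sketch the displacement of the tracked points stands in for the hand gesture, and the intensity value stands in for the glove-colour check; a full system would segment the glove region first and compute both quantities only inside it.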