Abstract

As computers have become ubiquitous, the forms of human-computer interaction have grown increasingly diverse. In many situations, a computer is controlled not only through the mouse and keyboard but also through body language and gestures. For people with physical disabilities, controlling the computer through hand movements can be essential to interaction, and simulation applications likewise benefit from such interfaces. This paper presents a solution for building a hand tracking and gesture recognition system that enables cursor movement and the corresponding mouse and keyboard actions. Through implementation and evaluation, the research team confirms that the system works stably and accurately and can control the computer in place of a conventional mouse and keyboard.
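To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch of gesture-based cursor control. It assumes MediaPipe Hands for hand landmark detection and PyAutoGUI for synthesizing mouse events; the paper does not name its libraries or gesture mapping, so these choices and the pinch-to-click rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: webcam -> hand landmarks -> cursor movement and clicks.
# MediaPipe Hands and PyAutoGUI are assumed here; the paper's actual tools may differ.
import cv2
import mediapipe as mp
import pyautogui

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                      # mirror the image for natural pointing
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        index_tip, thumb_tip = lm[8], lm[4]         # landmark 8 = index fingertip, 4 = thumb tip
        # Map the normalized fingertip position to screen coordinates to move the cursor.
        pyautogui.moveTo(index_tip.x * screen_w, index_tip.y * screen_h)
        # Treat a pinch (thumb close to index fingertip) as a left click (assumed gesture).
        if abs(index_tip.x - thumb_tip.x) < 0.03 and abs(index_tip.y - thumb_tip.y) < 0.03:
            pyautogui.click()
    if cv2.waitKey(1) & 0xFF == 27:                 # press Esc to quit
        break
cap.release()
```

In practice such a system would add smoothing of the cursor trajectory and debouncing of the click gesture to avoid jitter and repeated clicks; those refinements are omitted here for brevity.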
