Abstract

With the rapid advancement of computer technology, developing new ways of interacting with computers is becoming increasingly important. The progression from the CLI (Command Line Interface) to the GUI (Graphical User Interface) has been remarkable, and the next step is the NUI (Natural User Interface), in which the user's natural environment provides input to the machine. Imagine playing video games such as Temple Run or Subway Surfers, or even playing a musical instrument, on a computer without touching it; interfaces of this kind can also improve users' digital well-being. NUI/NUX is also especially valuable during a pandemic, when touch in public places should be minimized. An input module such as a virtual mouse, which uses object detection, object tracking, and gestures to assist communication, could be a viable replacement for the traditional touch screen and hardware mouse. The proposed design is a TensorFlow-based mouse-control system that uses hand gestures captured through a webcam in the RGB color space. It allows users to control the system cursor with their hand, which the computer's webcam tracks, and to perform mouse operations such as left click, right click, scroll, drag, and move using different hand gestures. The system is implemented in Python with the TensorFlow and OpenCV libraries for real-time computer vision.
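The gesture-to-action mapping the abstract describes can be sketched in Python. This is a minimal illustration, not the authors' implementation: the landmark format (normalized (x, y) coordinates for each fingertip and the joint below it, with y increasing downward, as in common hand-tracking models), the finger-state rules, and the gesture vocabulary are all assumptions made for the sake of the example.

```python
# Hypothetical sketch: map hand landmarks to mouse operations.
# Assumes a hand tracker (e.g. a TensorFlow model) yields normalized
# (x, y) tuples, y increasing downward, for each fingertip ("tips")
# and the joint below it ("pips") -- this format is an assumption.

FINGERS = ["index", "middle", "ring", "pinky"]


def fingers_up(tips, pips):
    """A finger counts as raised when its tip is above its lower joint."""
    return {f: tips[f][1] < pips[f][1] for f in FINGERS}


def gesture_to_action(tips, pips):
    """Map a simple finger-state vocabulary to a mouse operation."""
    up = fingers_up(tips, pips)
    raised = [f for f in FINGERS if up[f]]
    if raised == ["index"]:
        return "move"          # index finger only: move the cursor
    if raised == ["index", "middle"]:
        return "left_click"    # index + middle raised: left click
    if raised == ["middle"]:
        return "right_click"
    if len(raised) == 4:
        return "scroll"        # open palm: scroll mode
    if not raised:
        return "drag"          # closed fist: drag
    return "idle"
```

In a full pipeline, each webcam frame (read with OpenCV) would be passed to the hand detector, its landmarks fed to `gesture_to_action`, and the returned action dispatched to an OS automation layer that moves or clicks the real cursor.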
