Human-computer interaction has shifted markedly in recent years toward more intuitive and efficient interfaces, prompting researchers to investigate new input techniques, particularly gesture-based control systems. This paper introduces a gesture-based virtual mouse and keyboard that combines hand-gesture recognition with eye-tracking technology to offer a novel, seamless mode of human-computer interaction. The system tracks the user's hand and eye movements and interprets them as commands, enabling cursor control and keyboard input without physical peripherals. Eye tracking improves the system's accuracy and responsiveness, allowing more precise cursor control and interaction with on-screen elements, while hand-gesture recognition provides a natural, ergonomic input channel in which simple motions carry out commands. The architecture couples machine-learning algorithms for gesture recognition with real-time processing to ensure low latency and high responsiveness, and it is flexible and customizable: users can define their own gesture commands to suit their needs and preferences. The proposed system has undergone rigorous testing and evaluation, with results showing promising performance and usability across a range of scenarios and applications. This approach to interacting with computers has the potential to improve user experience, accessibility, and productivity in a variety of computing contexts.

Keywords: Human-Computer Interaction, Gesture Recognition, Eye-Tracking
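As a minimal sketch of the hand-gesture half of such a pipeline, the loop below maps an index-fingertip position to the screen cursor and interprets a thumb-index pinch as a left click. The abstract does not name any libraries or gesture vocabulary, so MediaPipe Hands, OpenCV, and pyautogui are assumptions chosen purely for illustration, not the authors' actual implementation.

```python
import cv2
import mediapipe as mp
import pyautogui

mp_hands = mp.solutions.hands
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the image so motion feels natural
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            tip = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            thumb = lm[mp_hands.HandLandmark.THUMB_TIP]
            # Map normalized fingertip coordinates to screen pixels.
            pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)
            # A thumb-index pinch (tips nearly coincident) stands in for a click.
            if abs(tip.x - thumb.x) < 0.04 and abs(tip.y - thumb.y) < 0.04:
                pyautogui.click()
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
```

A practical system along the lines the abstract describes would add cursor smoothing and click debouncing, and would fuse the eye-tracking signal with the hand signal for finer on-screen targeting.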