Abstract

Human-computer interaction through hand gestures has become increasingly popular. A physical mouse device, whether a basic desktop mouse or a laptop touch pad, requires physical contact from the user to convey input. Many advanced technologies have emerged, but they all still require physical contact, so a ubiquitous interface is needed. Our method uses a camera together with computer-vision and voice-based techniques, such as image segmentation and gesture recognition, to control mouse actions such as left click, double click, right click, and dragging, and we show that it can do what present mouse devices do. Our project, Gesture and Voice Based PC Control, is a real-time system capable of understanding mouse commands given by hand gestures and voice commands. The end user can communicate with the computer through these hand gestures and voice commands, which avoids the need for physical contact with the computer to control mouse input. In this way the interface becomes ubiquitous.

Introduction

As computer technology continues to improve, people use smaller and smaller electronic devices and want to use them ubiquitously. There is a need for new interfaces designed specifically for such devices. The importance of HCI, and in particular gesture recognition, is increasingly recognized. Simple interfaces already exist, such as the embedded keyboard, folding keyboard, and mini keyboard [2]. However, these interfaces need a certain amount of space to use and cannot be used while in motion. Touch-screen devices also provide a good control interface and are now used globally in many applications [3], but cost and other hardware limitations restrict them. By applying gesture technology and controlling mouse actions with natural hand gestures, we can reduce the space required. We propose an approach that uses a camera to control mouse actions.

Problem Definition

To develop a software solution to a problem, the first step is to understand the problem. The mouse is a physical device, subject to mechanical wear and tear, and it demands that the user make physical contact to convey input, which is no longer appropriate. Natural actions used in human-to-human communication, such as speech and gesture, seem more appropriate. Interfaces based on computational perception and computer vision should be available for accomplishing the goals of ubiquitous computing [5]. The problem here is to develop a way for humans to interact with a computer without any physical contact with it.

Need for the New System

The mouse is a physical device and is subject to mechanical wear and tear. It requires the user to make physical contact to convey input, which is not appropriate. Instead, natural actions from human-to-human communication, such as speech and gesture, seem more appropriate [5]. This
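As an illustration of the camera-driven mouse control described in the abstract, the following is a minimal sketch, assuming OpenCV, NumPy, and pyautogui are available. The hand is segmented with a crude HSV colour threshold, its centroid is mapped to screen coordinates, and a simple area heuristic stands in for a click gesture; the colour range, libraries, and heuristic are illustrative assumptions, not the project's actual segmentation and recognition pipeline.

```python
# Illustrative sketch: colour-based hand segmentation driving the mouse cursor.
# Assumes OpenCV (cv2), NumPy and pyautogui are installed; the HSV range and
# the area-based "click" heuristic are placeholder assumptions only.
import cv2
import numpy as np
import pyautogui

LOWER_SKIN = np.array([0, 48, 80], dtype=np.uint8)    # assumed skin-tone range
UPPER_SKIN = np.array([20, 255, 255], dtype=np.uint8)

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                         # mirror for natural movement
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)    # crude image segmentation
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        hand = max(contours, key=cv2.contourArea)      # assume largest blob is the hand
        m = cv2.moments(hand)
        if m["m00"] > 0:
            cx = int(m["m10"] / m["m00"])
            cy = int(m["m01"] / m["m00"])
            h, w = frame.shape[:2]
            # Map the hand centroid from camera coordinates to screen coordinates.
            pyautogui.moveTo(cx * screen_w // w, cy * screen_h // h)
            # Placeholder trigger: a very large contour stands in for a "click" gesture.
            if cv2.contourArea(hand) > 0.4 * w * h:
                pyautogui.click()

    cv2.imshow("gesture mouse (sketch)", mask)
    if cv2.waitKey(1) & 0xFF == 27:                    # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```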

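The voice-command side could, under the same caveats, be sketched with an off-the-shelf speech recognizer. The command vocabulary and the Google Web Speech backend below are assumptions made purely for illustration, not the project's actual recognizer.

```python
# Illustrative sketch: mapping a few spoken commands to mouse actions.
# Assumes the SpeechRecognition and pyautogui packages; the vocabulary and
# recognition backend are assumptions for illustration only.
import pyautogui
import speech_recognition as sr

COMMANDS = {                      # assumed vocabulary, not the project's actual one
    "left click": pyautogui.click,
    "right click": pyautogui.rightClick,
    "double click": pyautogui.doubleClick,
}

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    while True:
        audio = recognizer.listen(source, phrase_time_limit=3)
        try:
            phrase = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue              # nothing intelligible was heard
        if phrase in COMMANDS:
            COMMANDS[phrase]()    # perform the matching mouse action
        elif phrase == "stop listening":
            break
```

In practice the gesture loop and the voice listener would run concurrently, for example with the listener in a background thread, so that gesture tracking is not blocked while waiting for speech.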