Abstract

This paper introduces a prototype system for controlling a PC with head movements and voice commands. The system is a multimodal interface for computer control whose selected modes of interaction are speech and gesture. Computers and information technologies have become part of daily practice, and able-bodied users typically control a PC with a keyboard, mouse, trackball, or touchpad. These standard peripherals, however, are often unsuitable for people with disabilities, who may be unable to operate them, for example because of myopathy or because they cannot move their hands after an injury. Our system has been developed to provide computer access for people with severe disabilities. It tracks the user's head movements with a video camera and translates them into movements of the mouse pointer on the screen, while voice commands are interpreted as button presses. The result is a proposed system that people with motor disabilities can use to control a PC.
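
The abstract describes mapping camera-tracked head motion to pointer motion. The following is a minimal sketch of that idea only, under assumptions not taken from the paper: OpenCV's Haar-cascade face detector stands in for whatever head tracker the authors used, PyAutoGUI stands in for the mouse-control layer, the GAIN constant is a hypothetical sensitivity setting, and the speech/button-press channel is omitted entirely.

    # Sketch: translate head (face) displacement between frames into
    # relative mouse-pointer movement. Requires opencv-python and pyautogui.
    import cv2
    import pyautogui

    GAIN = 2.0  # hypothetical scaling from head displacement (pixels) to pointer movement

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam
    prev_center = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]              # take the first detected face
            center = (x + w // 2, y + h // 2)  # head position in image coordinates
            if prev_center is not None:
                dx = (center[0] - prev_center[0]) * GAIN
                dy = (center[1] - prev_center[1]) * GAIN
                pyautogui.moveRel(dx, dy)      # head motion -> pointer motion
            prev_center = center
        cv2.imshow("head tracker (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == 27:        # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()

In a full system along the lines the abstract outlines, a speech-recognition component would run alongside this loop and issue click events (e.g., on recognized command words) instead of the keyboard exit used here.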
