Abstract

Large-scale display systems with immersive human–computer interaction (HCI) are an important solution for virtual reality (VR) systems. In contrast to traditional interactive VR systems, which require the user to wear a heavy VR headset for visualization and data gloves for HCI, the proposed method uses a large-scale display screen (with or without 3D glasses) to visualize the virtual environment and a bare-handed gesture recognition solution to receive user instructions. The entire framework is referred to as an immersive HCI system. Through a virtual 3D interactive rectangular parallelepiped, we establish the correspondence between the virtual scene and the control information. A bare-handed gesture recognition method based on an extended genetic algorithm is presented, and an arm motion estimation method is designed based on fuzzy predictive control theory. Experimental results show that the proposed method achieves lower error rates than most existing solutions while maintaining an acceptable recognition frequency.
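
The abstract does not detail how the correspondence between the virtual scene and the control information is established beyond naming a virtual 3D interactive rectangular parallelepiped. The sketch below illustrates one plausible reading under that assumption: tracked hand coordinates inside a physical interaction cuboid are linearly normalized into the virtual scene's control volume. The `InteractionBox` class, its dimensions, and the clamping behaviour are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass


@dataclass
class InteractionBox:
    """Hypothetical axis-aligned rectangular parallelepiped in front of the
    display, expressed in tracker coordinates (millimetres)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, x: float, y: float, z: float) -> bool:
        """True if the tracked hand position lies inside the interaction box."""
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

    def to_scene(self, x: float, y: float, z: float,
                 scene_dims: tuple[float, float, float]) -> tuple[float, float, float]:
        """Map a hand position inside the box to virtual-scene coordinates.

        scene_dims: (width, height, depth) of the virtual scene's control volume.
        Positions outside the box are clamped to its faces; the mapping is a
        simple linear normalization (an assumption, not the paper's method).
        """
        def norm(v: float, lo: float, hi: float) -> float:
            return min(max((v - lo) / (hi - lo), 0.0), 1.0)

        sx, sy, sz = scene_dims
        return (norm(x, self.x_min, self.x_max) * sx,
                norm(y, self.y_min, self.y_max) * sy,
                norm(z, self.z_min, self.z_max) * sz)


if __name__ == "__main__":
    # Illustrative numbers only: a 600 x 400 x 500 mm box mapped to a unit-cube scene.
    box = InteractionBox(-300, 300, 0, 400, 200, 700)
    print(box.to_scene(0, 200, 450, scene_dims=(1.0, 1.0, 1.0)))  # -> (0.5, 0.5, 0.5)
```

In such a design, the clamping step keeps the cursor on the boundary of the control volume when the hand drifts outside the cuboid, which is one common way to avoid losing control focus; whether the paper handles this case the same way is not stated in the abstract.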
