Abstract

In this paper, we propose a new interface for controlling virtual reality (VR) content, games, and animations in real time using the user's breath and the acceleration sensor of a mobile device. Although interaction techniques are central to VR and physically based animation, user interface (UI) methods that exploit other kinds of devices or controllers have received little attention; most proposed interaction techniques focus on screen touch and motion recognition. In our approach, the direction of the breath is calculated from the position and angle between the user and the mobile device, and the control position at which the breath acts on the content is determined using the device's built-in acceleration sensor. Finally, to remove noise from the breath input, the magnitude of the wind is filtered with a kernel that models a pattern similar to an actual breath. To demonstrate the effectiveness of this approach, we produced real-time interaction results in which the breath is applied as an external force to VR content, games, and animations.
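The filtering step can be illustrated with a small sketch. The following Python code is a minimal illustration under our own assumptions, not the paper's implementation: it takes an array of microphone amplitudes and smooths it by convolution with a hypothetical breath-shaped kernel (a fast attack followed by a slower decay), so short noise spikes that do not match the breath pattern are suppressed.

    import numpy as np

    def breath_kernel(length=32, attack=0.2):
        """Hypothetical kernel shaped like a breath burst:
        a quick rise followed by a slower exponential decay."""
        t = np.linspace(0.0, 1.0, length)
        rise = np.clip(t / attack, 0.0, 1.0)                    # fast attack
        decay = np.exp(-3.0 * np.clip(t - attack, 0.0, None))   # slow decay
        k = rise * decay
        return k / k.sum()                                      # preserve magnitude

    def filter_breath_magnitude(samples):
        """Smooth raw microphone amplitudes with the breath kernel."""
        return np.convolve(samples, breath_kernel(), mode="same")

    # Example: a noisy amplitude signal containing one genuine breath burst.
    raw = np.abs(np.random.randn(256)) * 0.05
    raw[100:140] += np.linspace(1.0, 0.0, 40)                   # simulated breath
    wind_magnitude = filter_breath_magnitude(raw)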

Highlights

  • As virtual reality (VR) and augmented reality (AR) have attracted widespread attention, related industries, such as games, medicine, the military, simulation, and film, are developing rapidly

  • We present an interface technique that changes the control position according to the orientation of the mobile device (see the sketch after this list)

  • Since the breath is blown toward the front of the mobile device, the resulting upward-flowing wind vector is visualized in real time
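As a rough illustration of the orientation-based control position, the Python sketch below maps the device's gravity-aligned tilt, as reported by the acceleration sensor, to a 2D control point on the content. All names and the normalization scheme are our own assumptions, not the paper's method.

    import math

    def control_position(ax, ay, az, width, height):
        """Map accelerometer readings (ax, ay, az, in m/s^2) to a
        screen-space control point. Tilting the device moves the
        point away from the screen center; names are illustrative."""
        g = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
        # Pitch and roll estimated from the gravity direction.
        pitch = math.asin(max(-1.0, min(1.0, ay / g)))
        roll = math.asin(max(-1.0, min(1.0, ax / g)))
        # Normalize tilt (+-90 degrees) into [0, 1] screen coordinates.
        u = 0.5 + roll / math.pi
        v = 0.5 + pitch / math.pi
        return u * width, v * height

    # Example: device tilted slightly right and toward the user.
    x, y = control_position(1.2, -2.0, 9.5, 1920, 1080)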


Summary

Introduction

As virtual reality (VR) and augmented reality (AR) have attracted widespread attention, related industries, such as games, medicine, the military, simulation, and film, are developing rapidly. In our interface, the user blows directly on the mobile device to determine the direction and magnitude of the wind, and the content is controlled by computing, from the acceleration sensor, the control position at which the breath is applied. We produced interactive results in which the user controls physically based fluid simulations, VR content, and games with the breath in real time, according to the position and orientation of the mobile device and the magnitude of the breath. We present an interface technique that changes the control position according to the orientation of the mobile device. Based on this technique, the user can move the target position to which the breath is applied, rather than controlling the content at a fixed position. The control points and the breath direction and magnitude calculated by the mobile device were applied to VR, game, and animation environments for real-time control.
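To make the control loop concrete, here is a minimal Python sketch, under assumed names and a simple grid-fluid model of our own choosing, of how a filtered breath could be injected as an external force into a 2D velocity field at the computed control position.

    import numpy as np

    def apply_breath_force(velocity, cx, cy, direction, magnitude,
                           radius=5, dt=1.0 / 60.0):
        """Add the breath as an external force to a 2D velocity field.

        velocity  : array of shape (H, W, 2), the fluid grid velocities
        cx, cy    : control position (grid cells) from the accelerometer
        direction : unit 2D vector of the breath (from user/device pose)
        magnitude : filtered breath strength
        """
        h, w, _ = velocity.shape
        ys, xs = np.mgrid[0:h, 0:w]
        dist2 = (xs - cx) ** 2 + (ys - cy) ** 2
        # Gaussian falloff: the force is strongest at the control point.
        falloff = np.exp(-dist2 / (2.0 * radius ** 2))
        velocity[..., 0] += dt * magnitude * direction[0] * falloff
        velocity[..., 1] += dt * magnitude * direction[1] * falloff
        return velocity

    # Example: one simulation step on a 64x64 grid, blowing to the right.
    v = np.zeros((64, 64, 2))
    v = apply_breath_force(v, cx=32, cy=32, direction=(1.0, 0.0), magnitude=3.0)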

Related work
Results
Evaluation
Discussion and implementation
Conclusions and future work

