Abstract

To create immersive Virtual Reality (VR) applications and training environments, an appropriate method for allowing participants to interact with the virtual environment and the objects within it must be considered. An approach that offers increased immersion and accurately models real-world behaviour would be advantageous, particularly in the areas of health, entertainment and engineering. Traditional consumer VR methods for facilitating interaction, such as controllers, are restricted by a lack of tactile feedback and do not accurately represent real-world interactions with physical objects in terms of shape, limiting immersion. Ideally, physical objects would be transported into the virtual world and used as a means of interacting with the environment, via a robust tracking algorithm or motion capture system. However, achieving this in a real-time, markerless manner for a range of object types remains an open challenge. Moreover, costly motion capture systems and tracking algorithms that require multiple cameras are impractical for everyday VR use. Given recent advances in object tracking and mesh reconstruction using neural networks, we present a novel neural network, VRProp-Net+, which predicts rigid and articulated model parameters of known everyday objects in unconstrained environments from RGB images at interactive frame rates. VRProp-Net+ uses a novel synthetic training methodology and therefore does not require the time-consuming capture or manual labelling procedures seen in many prominent neural-network tracking and mesh reconstruction approaches. We present our network as part of an egocentric tracking framework that predicts object pose and shape from a moving camera. This scenario is well suited to practical VR experiences where the camera may be mounted on the VR Head-Mounted Display (HMD) itself, creating a dynamic capture volume that allows a user to move around and interact with a virtual world without the need for multiple fixed cameras.
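
The abstract describes per-frame regression of rigid and articulated model parameters from egocentric RGB images. The sketch below is only a minimal illustration of that idea under assumed details: the class name PoseShapeRegressor, the small convolutional backbone, and the parameter layout (translation, quaternion rotation, shape/articulation values) are placeholders and do not reflect the actual VRProp-Net+ architecture or training procedure.

```python
import torch
import torch.nn as nn

class PoseShapeRegressor(nn.Module):
    """Hypothetical regressor: maps one RGB frame to a 6-DoF pose plus
    shape/articulation parameters for a known object class."""
    def __init__(self, num_shape_params: int = 4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # 3 translation + 4 quaternion rotation + shape/articulation params
        self.head = nn.Linear(64, 3 + 4 + num_shape_params)

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(rgb))

def track_frame(model: nn.Module, frame: torch.Tensor):
    """Per-frame inference: one egocentric RGB image -> object parameters."""
    with torch.no_grad():
        out = model(frame.unsqueeze(0)).squeeze(0)
    translation, rotation_q, shape = out[:3], out[3:7], out[7:]
    # Normalise the quaternion before applying it to the virtual prop.
    rotation_q = rotation_q / rotation_q.norm().clamp_min(1e-8)
    return translation, rotation_q, shape

if __name__ == "__main__":
    model = PoseShapeRegressor().eval()
    dummy_frame = torch.rand(3, 128, 128)   # stand-in for a camera frame
    t, q, s = track_frame(model, dummy_frame)
    print(t.shape, q.shape, s.shape)
```

In a VR setting, a loop of this kind would run on each frame from the HMD-mounted camera and the predicted pose and shape parameters would drive the corresponding virtual prop in the scene.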
