Abstract

Spatial information understanding is fundamental to visual perception in the Metaverse. Beyond the stereoscopic visual cues naturally provided in the Metaverse, the human visual system may exploit auxiliary information from shadow casting or motion parallax, when available, to perceive the 3D virtual world. However, the combined use of shadows and motion parallax to improve 3D perception has not been fully studied. In particular, when visualizing volumetric data together with associated skeleton models in VR, how to provide auxiliary visual cues that enhance observers' perception of structural information remains a key yet underexplored topic. This problem is particularly challenging for the visualization of data in biomedical research. In this paper, we focus on immersive analytics in neurobiology, where the structural information includes the relative positions of objects (nuclei / cell bodies) in 3D space and the spatial measurement and connectivity of segments (axons and dendrites) in a model. We present a perceptual experiment designed to understand the effects of shadow casting and motion parallax on the observation of neuron structures, and we report and discuss the feedback and analysis of the experiment.
