Abstract

Few HMD-based virtual environment systems display a rendering of the user's own body. Subjectively, this often leads to a sense of disembodiment in the virtual world. We explore how seeing one's own body in such systems affects an objective measure of the accuracy of one form of space perception. Using an action-based response measure, we found that participants who explored near space while seeing a fully articulated and tracked visual representation of themselves subsequently made more accurate judgments of absolute egocentric distance to locations 4 m to 6 m away from where they were standing than did participants who saw no avatar. A non-animated avatar also improved distance judgments, but by a lesser amount. Participants who viewed either an animated or a static avatar positioned 3 m in front of their own position made subsequent distance judgments with accuracy similar to that of participants who viewed the equivalent animated or static avatar positioned at their own location. We discuss the implications of these results for theories of embodied perception in virtual environments.
