Abstract

Virtual reality (VR) is a technology that is gaining traction in the consumer market. With it comes an unprecedented ability to track body motions. These body motions are diagnostic of personal identity, medical conditions, and mental states. Previous work has focused on the identifiability of body motions in idealized situations in which some action is chosen by the study designer. In contrast, our work tests the identifiability of users under typical VR viewing circumstances, with no specially designed identifying task. Out of a pool of 511 participants, the system identifies 95% of users correctly when trained on less than 5 min of tracking data per person. We argue these results show nonverbal data should be understood by the public and by researchers as personally identifying data.
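The abstract implies a standard identification pipeline: reduce each user's tracking stream to feature vectors, train a classifier over users, and test on held-out windows. Below is a minimal sketch of that setup. It is illustrative only: the feature set (per-channel min/max/median over short pose windows), the random-forest classifier, and all names and dimensions are assumptions rather than the authors' method, and the tracking data is synthetic.

```python
# Hedged sketch of user identification from motion-tracking windows.
# Synthetic stand-in data; not the paper's pipeline or dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical tracking format: short windows of pose samples, with
# position (x, y, z) and orientation (yaw, pitch, roll) for three tracked
# devices (head + two hands) = 18 channels per sample.
N_USERS, WINDOWS_PER_USER, SAMPLES, CHANNELS = 50, 40, 90, 18

def featurize(window):
    # Reduce a (SAMPLES, CHANNELS) window to simple per-channel summary
    # statistics; min/max/median is one common, interpretable choice.
    return np.concatenate([window.min(0), window.max(0), np.median(window, 0)])

# Synthetic data: each user gets a characteristic pose offset so the
# classification task is learnable, mimicking idiosyncratic body motion.
X, y = [], []
for user in range(N_USERS):
    bias = rng.normal(size=CHANNELS)
    for _ in range(WINDOWS_PER_USER):
        window = bias + 0.5 * rng.normal(size=(SAMPLES, CHANNELS))
        X.append(featurize(window))
        y.append(user)
X, y = np.array(X), np.array(y)

# Train on part of each user's windows, identify users on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"identification accuracy: {clf.score(X_test, y_test):.2%}")
```

A real evaluation at the paper's scale would enroll all 511 participants and limit training data to under 5 min per person; the sketch shrinks both numbers purely for runtime.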

Highlights

  • Virtual reality (VR) is a technology that is gaining traction in the consumer market

  • In contrast to previous work, which has focused on designing VR tasks to identify or authenticate users [4,5,6], we begin with a task that was not designed for identification

  • The question that must be asked first is whether VR produces identifying information


Summary

Related work

Previous work has used tracking data to identify users, but identification is framed positively, often as a tool for authentication. VR tracking data, as a measure of body pose and motion, is a surprisingly powerful source of information. There is a growing literature on the use of tracking data to diagnose dementia [14,15,16]. Previous work on identifying users of head-mounted displays has used different types of biometric measurements. Authentication requires strong positive evidence identifying a single user from any other user of a system, and usually leads to elevated privileges (e.g., access to sensitive data, or the ability to run more powerful commands). Little work has been done on determining identity from data gathered from an experience designed with no intention of identification.
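The distinction drawn here between identification and authentication can be made concrete. Assuming some model produces a per-user score for each tracking window (the scores below are random placeholders, not real model output), identification is an N-way decision over all enrolled users, while authentication is a binary accept/reject decision on one claimed identity:

```python
# Hedged sketch: identification vs. authentication as decision problems.
import numpy as np

rng = np.random.default_rng(1)
N_USERS = 50
# 8 windows of fake per-user scores; each row sums to 1 like class probabilities.
scores = rng.dirichlet(np.ones(N_USERS), size=8)

# Identification: pick the best-scoring enrolled user for each window.
identified = scores.argmax(axis=1)

# Authentication: threshold the score of one claimed identity.
# claimed_user and THRESHOLD are hypothetical operating-point choices.
claimed_user, THRESHOLD = 7, 0.5
accepted = scores[:, claimed_user] >= THRESHOLD
print(identified, accepted)
```

This framing shows why authentication demands the stronger evidence described above: a single false acceptance grants privileges, whereas identification only has to rank the true user above the other candidates.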

Methods
Results
Discussion
Limitations and future work
