Abstract

This paper investigates the effect of using augmented reality (AR) annotations and two different gaze visualizations, head pointer (HP) and eye gaze (EG), in an AR system for remote collaboration on physical tasks. First, we developed a spatial AR remote collaboration platform that supports sharing the remote expert’s HP or EG cues. We then evaluated the prototype system in a user study comparing three conditions for sharing non-verbal cues: (1) a cursor pointer (CP), (2) HP, and (3) EG, with respect to task performance, workload, and user experience. We found a clear difference in performance time across the three conditions, but no significant difference between the HP and EG conditions. For perceived collaboration quality, the HP and EG interfaces were rated significantly higher than the CP interface, while workload did not differ significantly across the three conditions. We used low-cost head tracking for the HP cue and found that it served as an effective referential pointer, implying that in some circumstances HP can be a good proxy for EG in remote collaboration. Head pointing is more accessible and cheaper than dedicated eye-tracking hardware, and it paves the way for multi-modal interaction combining HP and gesture in AR remote collaboration.
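The abstract does not give implementation details, but a minimal sketch can illustrate how an HP cue of this kind is typically derived from a tracked head pose: the head's forward ray is intersected with the task surface (approximated here as a plane) to place the pointer annotation in the shared workspace. All names, the +Z look-direction convention, and the planar-surface assumption below are illustrative, not taken from the paper.

    # Minimal sketch (not the authors' implementation): derive a head-pointer
    # (HP) annotation point by casting the head's forward ray onto the task
    # surface, approximated as a plane.
    import numpy as np

    def head_pointer(head_pos, head_rot, plane_point, plane_normal):
        """Intersect the head's forward ray with the workspace plane.

        head_pos:     (3,) head position in world coordinates
        head_rot:     (3, 3) rotation matrix of the tracked head pose
        plane_point:  (3,) any point on the task-surface plane
        plane_normal: (3,) unit normal of that plane
        Returns the 3D point at which to render the HP cue, or None if the
        viewer is facing away from (or parallel to) the surface.
        """
        forward = head_rot @ np.array([0.0, 0.0, 1.0])  # +Z as look direction
        denom = forward @ plane_normal
        if abs(denom) < 1e-6:            # ray is parallel to the plane
            return None
        t = ((plane_point - head_pos) @ plane_normal) / denom
        return head_pos + t * forward if t > 0 else None

An EG cue can be produced with the same ray-plane intersection by substituting the eye tracker's gaze direction for the head-forward vector, which is why low-cost head tracking can stand in for eye tracking when the two rays roughly coincide.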
