Steering is a fundamental task in interactive Virtual Reality (VR) systems. Prior work has shown that movement direction can significantly influence user behavior in steering tasks, and that different interactive devices, such as tablets and PCs, can elicit different behavioral patterns. However, the effect of movement direction in VR remains unexplored. Given the widespread use of steering tasks in virtual environments (VEs), including menu adjustment and object manipulation, this work seeks to understand and model the directional effect, focusing on barehand interaction, which is typical in VEs. This paper presents the results of two studies. The first study collected behavioral data in four categories: movement time, average movement speed, success rate, and number of re-entries. Based on these results, we examined the effect of movement direction and built the SθModel, which we then evaluated empirically on the data collected in the first study. The results showed that our proposed model achieved the best performance across all metrics (r² > 0.95), improving prediction accuracy by more than 15% over the original Steering Law. We then further validated the SθModel in a second study that varied the device and steering direction. Consistent with the earlier assessment, the model remained the best performer at predicting both movement time and speed. Finally, based on these results, we formulated design recommendations for steering tasks in VEs to enhance user experience and interaction efficiency.
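For context, the original Steering Law that the SθModel improves upon predicts movement time from a path's length and width. A minimal sketch follows; the coefficients and example values are illustrative placeholders, not the paper's fitted parameters, and the SθModel's directional term is not reproduced here.

```python
def steering_law_mt(a: float, b: float, path_length: float, path_width: float) -> float:
    """Predict movement time for steering along a path of constant width.

    Classic Steering Law: MT = a + b * (A / W), where A is path length,
    W is path width, and a, b are empirically fitted coefficients.
    """
    index_of_difficulty = path_length / path_width
    return a + b * index_of_difficulty

# Example with illustrative coefficients (a = 0.2 s, b = 0.1 s):
# a 300 mm path of 30 mm width gives ID = 10, so MT = 0.2 + 0.1 * 10 = 1.2 s.
mt = steering_law_mt(0.2, 0.1, 300.0, 30.0)
```

The SθModel extends this formulation with a movement-direction term, which the studies show accounts for the directional behavioral differences observed in VEs.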