Mixed reality (MR) remote collaborative assembly is a form of computer-supported collaborative work that uses MR technology to share spatial information and collaboration status among geographically distributed collaborators, including remote experts and local users. However, owing to the abundance of mixed virtual and real-world information in the MR space and the narrow field of view of augmented reality (AR) glasses, users struggle to focus effectively on relevant and valuable visual information. Our research aims to enhance users' visual attention to critical guidance information in MR collaborative assembly tasks, thereby improving the clarity of instructions and facilitating the transmission of collaborative intention. We developed the Information Recommendation and Visual Enhancement System (IRVES), which combines an assembly process information hierarchy division mechanism, a content-based information recommendation system, and a gesture interaction-based information visual enhancement method. IRVES leverages the guidance expertise and preferences of remote experts to filter out irrelevant information and to present the key information that the remote expert conveys to the local user in an intuitive way through visual enhancement. We conducted a user study of a collaborative assembly task on a small engine in a laboratory environment. The experimental results indicate that IRVES outperforms a traditional MR remote collaborative assembly method (VG3DV) in terms of time performance, operational errors, cognitive performance, and user experience. Our research contributes a human-centered information visualization approach for remote experts and local users, providing a novel method for designing visual information interfaces in MR remote collaborative assembly tasks.