Abstract

Augmented Reality (AR) head-mounted displays (HMDs) deliver an increasingly immersive visualization of virtual content, but the available means of interaction, mainly based on gesture and/or voice, are still limited and lack realism and expressivity when compared to traditional physical means. In this context, integrating haptics within AR may help deliver an enriched experience, while facilitating the performance of specific actions, such as repositioning or resizing tasks, that otherwise depend heavily on the user's skills. This paper describes a flexible architecture designed to deploy haptically enabled AR applications for both mobile and wearable visualization devices. Haptic feedback may be generated by a variety of devices (e.g., wearable, graspable, or mid-air ones), and the architecture facilitates handling the specificity of each. The paper therefore discusses how to generate a haptic representation of a 3D digital object depending on the application and the target device. It also analyzes practical, relevant issues that arise when setting up a system to work with specific devices such as HMDs (e.g., HoloLens) and mid-air haptic devices (e.g., Ultrahaptics), such as the alignment between the real and the virtual world. The applicability of the architecture is demonstrated through the implementation of two applications, (a) Form Inspector and (b) Simon Game, built for HoloLens and iOS mobile phones for visualization and for the UHK for mid-air haptics delivery. These applications have been used to explore, with nine users, the efficiency, meaningfulness, and usefulness of mid-air haptics for form perception, object resizing, and push interaction tasks. Results show that, although mobile interaction is preferred when available, haptics turn out to be more meaningful than users initially expect for identifying shapes and for supporting resizing tasks. Moreover, this preliminary user study reveals some design issues when working with haptic AR; for example, users may expect a tailored interface metaphor, not necessarily inspired by natural interaction. This was the case with our virtual pressable buttons, which were built to mimic real buttons through haptics but were interpreted differently by the study participants.
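
As an illustration of the real-to-virtual alignment issue mentioned above, the following sketch (not taken from the paper) shows how a rigid transform between the haptic device frame and the AR world frame could be estimated from a handful of matched calibration points using the standard Kabsch (SVD-based) method. The function name and the calibration workflow are assumptions made only for illustration.

```python
# Illustrative sketch (not the paper's code): estimate the rigid transform that
# maps haptic-device coordinates into the AR world frame from matched
# calibration points, using the Kabsch (SVD-based) method.
import numpy as np

def estimate_rigid_transform(device_pts, world_pts):
    """Return (R, t) such that world ≈ R @ device + t.

    device_pts, world_pts: (N, 3) arrays of corresponding points, N >= 3,
    e.g. positions measured above the haptic array and the same positions
    probed in the AR world frame.
    """
    P = np.asarray(device_pts, dtype=float)
    Q = np.asarray(world_pts, dtype=float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)      # centroids
    H = (P - pc).T @ (Q - qc)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = qc - R @ pc
    return R, t

# Usage with synthetic data: four non-coplanar calibration points.
device = np.array([[0.00, 0.00, 0.20],
                   [0.05, 0.00, 0.20],
                   [0.00, 0.05, 0.25],
                   [0.02, 0.03, 0.30]])
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.10, -0.30, 1.50])
world = device @ R_true.T + t_true
R, t = estimate_rigid_transform(device, world)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With such a transform available, points computed on a virtual object's surface can be expressed in the device frame before any haptic stimulus is emitted.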

Highlights

  • In recent years, Augmented Reality (AR) has become a popular technology due to its integration in mobile apps and the increasing availability of powerful devices and development frameworks that enable the deployment of specialized solutions.

  • Due to the operating principle of the Ultrahaptics device, the haptic feedback is not properly received if the ultrasound is blocked by part of the hand.

  • We have presented our HARP system to enhance augmented reality perception by means of harptic elements, i.e., by adding a haptic representation to AR elements (a minimal illustrative sketch of this pairing follows the list).
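
The pairing idea in the last highlight, i.e., associating an AR element with a device-specific haptic representation, can be sketched as follows. The names used here (HarpticElement, HapticRenderer, MidAirRenderer, apply_transform) are hypothetical and only illustrate how such a pairing could be organized; they are not the HARP implementation.

```python
# Illustrative sketch (not the HARP implementation): pair an AR element with
# device-specific haptic renderers so each device handles its own specificity.
from dataclasses import dataclass, field
from typing import Protocol
import numpy as np


def apply_transform(T: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.c_[points, np.ones(len(points))]
    return (T @ homogeneous.T).T[:, :3]


class HapticRenderer(Protocol):
    """One implementation per device family (wearable, graspable, mid-air)."""
    def render(self, mesh_vertices: np.ndarray, pose: np.ndarray) -> None: ...


class MidAirRenderer:
    """Hypothetical mid-air renderer: project surface samples of the shape
    into the device frame, where they could be emitted as focal points."""
    def __init__(self, world_to_device: np.ndarray):
        self.world_to_device = world_to_device  # 4x4 homogeneous transform

    def render(self, mesh_vertices: np.ndarray, pose: np.ndarray) -> None:
        world_pts = apply_transform(pose, mesh_vertices)
        device_pts = apply_transform(self.world_to_device, world_pts)
        # A real implementation would hand these points to the device SDK;
        # here we only report how many would be emitted.
        print(f"emitting {len(device_pts)} focal points")


@dataclass
class HarpticElement:
    """An AR element plus the haptic renderers registered for it."""
    mesh_vertices: np.ndarray                  # (N, 3) surface samples
    pose: np.ndarray                           # 4x4 world pose of the element
    renderers: list = field(default_factory=list)

    def render_haptics(self) -> None:
        for renderer in self.renderers:
            renderer.render(self.mesh_vertices, self.pose)
```

A HarpticElement for a sphere, for instance, would hold sampled sphere vertices and a MidAirRenderer configured with the world-to-device transform estimated earlier.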



Introduction

In recent years, Augmented Reality (AR) has become a popular technology due to its integration in mobile apps (e.g., the Pokemon Go game [1]) and the increasing availability of powerful devices and development frameworks that enable the deployment of specialized solutions (e.g., for maintenance, training, or surveillance). In its most basic definition, augmented reality consists of combining digital information superimposed on a view of the real world [2]. This definition emphasizes only the visualization component, but sound interaction methods for object manipulation and task completion are key for AR service delivery. There is no dominant design yet for HMD interaction: although these devices enable a more natural exploration of the space, real-world interaction metaphors cannot be directly used.
