Abstract
Developing augmented reality (AR) applications for mobile devices and outdoor environments has historically required a number of technical trade-offs related to tracking. One approach is to rely on computer vision, which provides very accurate tracking but can be brittle and limits the generality of the application. Another approach is to rely on sensor-based tracking, which enables widespread use but at the cost of generally poor tracking performance. In this paper we present and evaluate a new approach, which we call Indirect AR, that enables perfect alignment of virtual content in a much greater number of application scenarios. To achieve this improved performance we replace the live camera view used in video see-through AR with a previously captured panoramic image. By doing this we improve the perceived quality of the tracking while still maintaining a similar overall experience. There are some limitations of this technique, however, related to the use of panoramas. We evaluate these boundary conditions on both a performance and experiential basis through two user studies. The results of these studies indicate that users preferred Indirect AR over traditional AR in most conditions, and that even when conditions degrade to the point that the experience changes, Indirect AR can still be a very useful tool in many outdoor application scenarios.
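To illustrate the core idea described above, the following minimal sketch renders a perspective view of a previously captured equirectangular panorama using the device's sensor-reported orientation, in place of a live camera feed. This is not the paper's implementation; the function name, the equirectangular panorama format, and the numpy-based rendering are assumptions made for this example. Because virtual annotations would be anchored to panorama coordinates rather than to the live camera image, sensor error shifts the whole view but never mis-registers the annotations against the background.

```python
import numpy as np

def render_indirect_ar_view(panorama, yaw_deg, pitch_deg, fov_deg=60.0,
                            out_w=640, out_h=480):
    """Render a perspective view of an equirectangular panorama for the
    orientation (yaw/pitch, in degrees) reported by the device sensors.

    Hypothetical sketch of the Indirect AR rendering step: the live camera
    image is replaced by a crop of a pre-captured panorama, so registration
    is between virtual content and the panorama, not the live world.
    """
    pano_h, pano_w = panorama.shape[:2]

    # Build a ray direction for every output pixel (pinhole camera model).
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)
    xs = np.arange(out_w) - out_w / 2.0
    ys = np.arange(out_h) - out_h / 2.0
    xv, yv = np.meshgrid(xs, ys)
    dirs = np.stack([xv, -yv, np.full_like(xv, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Rotate the rays by the sensor-reported orientation
    # (pitch about the x axis, yaw about the y axis).
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch),  np.cos(pitch)]])
    ry = np.array([[ np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    dirs = dirs @ (ry @ rx).T

    # Convert ray directions to equirectangular (longitude/latitude) pixels
    # and sample the panorama with nearest-neighbour lookup.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * pano_w).astype(int) % pano_w
    v = ((0.5 - lat / np.pi) * pano_h).astype(int).clip(0, pano_h - 1)
    return panorama[v, u]

if __name__ == "__main__":
    # Synthetic panorama stand-in: a horizontal gradient so sampling is visible.
    pano = np.zeros((512, 1024, 3), dtype=np.uint8)
    pano[..., 0] = np.linspace(0, 255, 1024, dtype=np.uint8)[None, :]
    view = render_indirect_ar_view(pano, yaw_deg=30.0, pitch_deg=5.0)
    print(view.shape)  # (480, 640, 3)
```

In an application following this scheme, virtual annotations could be composited onto the panorama (or projected with the same ray mapping) before display, so their alignment with the background does not depend on live tracking accuracy.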