Abstract

We present a mobile augmented reality system for outdoor annotation of the real world. To reduce user burden, we use aerial photographs in addition to the wearable system's usual data sources (position, orientation, camera, and user input). This allows the user to accurately annotate 3D features with only a few simple interactions from a single position, by aligning features in both the first-person viewpoint and the aerial view. We examine three types of aerial-photograph features (corners, edges, and regions) that are suitable for a wide variety of useful mobile augmented reality applications and are easily visible in aerial photographs. By using aerial photographs in combination with wearable augmented reality, we achieve much more accurate 3D annotation positions than was previously possible from a single user location.
