Abstract

In this paper, we design a mobile augmented reality (AR) system called TweetGlue, which overlays text messages (i.e., tweets) posted to a local social networking service onto the live camera view of mobile/wearable devices. By displaying each tweet at the current position of the user who posted it, the system supports social interaction among users. Accurate pose tracking of mobile devices is an essential building block of such mobile AR applications. While visual features extracted from the scene are commonly used for vision-based pose estimation, they are ill-suited to such a mobile social AR application because nearby human bodies often occlude them. To cope with this problem, we leverage an external pedestrian tracking system built on a small number of laser-based distance measurement sensors (LRS sensors), using the surrounding human bodies themselves as virtual markers for pose estimation. Each mobile device periodically analyzes images from its embedded camera to estimate the relative positions of the pedestrians in view. By matching these estimated relative positions against the accurate human location measurements from the LRS sensors, the system robustly identifies the location and horizontal orientation of each device. Through simulation experiments, we show that TweetGlue accurately identifies the pose of mobile devices in 83% of the simulated scenarios.
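
As a rough illustration of the matching step described above, the following Python sketch (ours, not the paper's implementation) grid-searches for the device pose whose predicted bearings to the LRS-tracked pedestrians best agree with the bearings observed in the camera image. The names (match_pose, observed_bearings, etc.), the assumed one-to-one association between image detections and LRS tracks, and the brute-force search strategy are illustrative assumptions rather than the authors' algorithm.

    import itertools
    import math

    def angle_diff(a, b):
        # Smallest signed difference between two angles, in radians.
        return (a - b + math.pi) % (2 * math.pi) - math.pi

    def match_pose(observed_bearings, pedestrian_positions, search_area, step=0.5):
        # Hypothetical matcher: brute-force search over candidate device poses
        # (x, y, heading) for the one whose predicted bearings to the LRS-tracked
        # pedestrians best fit the camera-observed bearings (least squares).
        # Assumes the i-th observed bearing corresponds to the i-th LRS track.
        x_min, x_max, y_min, y_max = search_area
        xs = [x_min + i * step for i in range(int((x_max - x_min) / step) + 1)]
        ys = [y_min + i * step for i in range(int((y_max - y_min) / step) + 1)]
        headings = [math.radians(d) for d in range(0, 360, 2)]
        best_err, best_pose = float("inf"), None
        for x, y, theta in itertools.product(xs, ys, headings):
            err = 0.0
            for (px, py), obs in zip(pedestrian_positions, observed_bearings):
                # Bearing the camera would observe from pose (x, y, theta).
                pred = angle_diff(math.atan2(py - y, px - x), theta)
                err += angle_diff(pred, obs) ** 2
            if err < best_err:
                best_err, best_pose = err, (x, y, theta)
        return best_pose

    if __name__ == "__main__":
        # Three pedestrians at known LRS positions; a device at (0, 0) facing +x
        # would observe them at exactly these camera-relative bearings.
        peds = [(3.0, 1.0), (4.0, -2.0), (2.0, 3.0)]
        obs = [math.atan2(py, px) for px, py in peds]
        print(match_pose(obs, peds, search_area=(-2.0, 2.0, -2.0, 2.0)))

In a real deployment the search region and heading range would presumably be narrowed by coarse priors (e.g., Wi-Fi positioning or a compass reading), and the association between image detections and LRS tracks would itself have to be estimated; the sketch sidesteps both issues for brevity.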
