Abstract

This paper presents a new context-aware mobile augmented reality system that provides rapid and robust on-site access to construction project information such as drawings, specifications, schedules, and budgets. The mobile augmented reality system does not require RF-based location tracking (e.g., GPS or WLAN) or optical fiducial markers to track the user's position. Instead, the user's location and orientation are derived automatically, and solely from imagery, by comparing photographs taken with the user's phone against a 3D point cloud model reconstructed from a set of site photographs. Once a 3D point cloud model of the construction site has been generated, field personnel can use mobile devices to photograph building elements and are presented on-site with a detailed list of project information related to the visible construction elements in an augmented reality format. The experimental results show that (1) the underlying 3D reconstruction module generates more complete 3D point cloud models, and does so faster, than other state-of-the-art Structure-from-Motion (SfM) algorithms; and (2) the localization method is an order of magnitude more accurate than state-of-the-art solutions and falls within the acceptable tolerances of most on-site engineering applications. Using an actual construction case study, the perceived benefits and limitations of the proposed method for on-site context-aware applications are discussed in detail.
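The localization step described above is, at its core, a camera pose estimation problem: 2D features in the new phone photograph are matched against points in the SfM cloud, and the camera position and orientation are recovered from those 2D-3D correspondences. The sketch below illustrates that idea with OpenCV's RANSAC-based PnP solver; the synthetic correspondences, camera intrinsics, and parameter values are placeholder assumptions for illustration only, not the authors' actual implementation.

```python
import numpy as np
import cv2

# --- Synthetic stand-ins (illustrative only) --------------------------------
# In a system like the one described, these correspondences would come from
# matching local features in the new phone photo against descriptors attached
# to the SfM point cloud.
rng = np.random.default_rng(0)
points_3d = rng.uniform(-5.0, 5.0, (100, 3))
points_3d[:, 2] += 20.0                        # keep points in front of the camera

K = np.array([[1500.0,    0.0, 960.0],          # assumed phone camera intrinsics
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)                              # assume negligible lens distortion

# Ground-truth pose, used here only to synthesize the matched 2D keypoints.
rvec_true = np.array([0.05, -0.10, 0.02])
tvec_true = np.array([0.30, -0.20, 1.00])
points_2d, _ = cv2.projectPoints(points_3d, rvec_true, tvec_true, K, dist)
points_2d = points_2d.reshape(-1, 2)

# --- Pose estimation: the core on-site localization step --------------------
# RANSAC-based PnP rejects wrong feature matches and recovers the camera pose
# relative to the point cloud's site coordinate system.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    points_3d, points_2d, K, dist,
    reprojectionError=3.0, iterationsCount=200)

if ok:
    R, _ = cv2.Rodrigues(rvec)                  # camera rotation matrix
    cam_pos = (-R.T @ tvec).ravel()             # camera center in site coordinates
    print("Estimated camera position:", cam_pos)
    print("Inlier correspondences:", len(inliers))
```

Given such a pose, project-information overlays can be rendered by projecting the 3D coordinates of known building elements into the photo with the recovered rotation and translation.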
