Abstract
Spatial data visualization technology helps users understand Geographic Information System (GIS) applications. Unlike traditional visualization methods, Augmented Reality (AR) inserts virtual objects and information directly into digital representations of the real world, making those objects and data easier to understand and interact with. However, building effective AR-GIS systems with rich spatial information visualization remains a challenging task. In addition, indoor AR-GIS systems are further impeded by their limited capacity to detect and display geometric and semantic information. To address this problem, we propose a novel AR and indoor map fusion method that automatically registers spatial information onto the live camera view of a mobile phone. We fuse Bluetooth Low Energy (BLE) and pedestrian dead reckoning (PDR) localization techniques to track the camera position. The proposed algorithm extracts a bounding box of the indoor map and matches it to the real-world scene. We then render the indoor map and semantic information into the real world, based on the spatial relationship between the indoor map and the live camera view, computed in real time. Experimental results demonstrate that our approach visualizes spatial information accurately and richly. Our augmented reality and indoor map fusion technique effectively links rich indoor spatial information to real-world scenes in AR-integrated GIS.
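The abstract does not specify how the BLE and PDR estimates are combined. As an illustration only, under the assumption of a simple complementary-filter fusion (weighting a smooth-but-drifting PDR track against noisy-but-absolute BLE fixes), the idea might be sketched as follows; the function names, step model, and weighting are hypothetical, not the paper's actual method:

```python
import math

def pdr_step(pos, heading_rad, step_len):
    """Advance a 2-D PDR position estimate by one detected step.

    PDR integrates relative motion (step events + heading), so it is
    smooth but accumulates drift over time.
    """
    x, y = pos
    return (x + step_len * math.cos(heading_rad),
            y + step_len * math.sin(heading_rad))

def fuse(pdr_pos, ble_pos, alpha=0.8):
    """Complementary filter: blend the drifting PDR estimate with an
    absolute (but noisy) BLE position fix.

    alpha close to 1 trusts PDR short-term; the BLE term slowly
    corrects accumulated drift.  The value 0.8 is an illustrative
    assumption, not a parameter from the paper.
    """
    return tuple(alpha * p + (1.0 - alpha) * b
                 for p, b in zip(pdr_pos, ble_pos))
```

For example, one step of length 0.7 m heading due east from the origin gives a PDR estimate of (0.7, 0.0), which `fuse` then nudges toward the latest BLE fix.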