Abstract

This paper presents an augmented reality system for indoor environments that does not require special tags or intrusive landmarks. We have designed a localisation architecture that fuses data from the sensors available in commodity smartphones to produce location estimates accurate enough to overlay information on the camera-phone display image. During a training phase, we used visual structure-from-motion techniques to run offline 3D reconstructions of the environment from correspondences among the scale-invariant feature transform (SIFT) descriptors of the training images. To determine the position of a smartphone, we first obtained a coarse-grained estimate from WiFi signals, the digital compass and the built-in accelerometer, using fingerprinting methods, probabilistic techniques and motion estimators. Then, using images captured by the camera, we performed a matching process to determine correspondences between 2D pixels and 3D model points, analysing only the subset of the 3D model delimited by the coarse-grained estimate. This multisensor approach achieves a good balance between accuracy and performance. Finally, a resection process provides high localisation accuracy when the camera has been calibrated beforehand, that is, when intrinsic parameters such as the focal length are known, and it remains accurate when an auto-calibration step is required. Our experiments showed that this proposal is suitable for applications combining location-based services and augmented reality: it achieves an average error as low as 15 cm with a response time under 0.5 s.
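The coarse-grained WiFi stage described above could be sketched, for illustration, as a weighted k-nearest-neighbour lookup over an RSSI fingerprint database. This is a minimal sketch of the general fingerprinting idea, not the paper's exact method; all access-point names, positions and signal values below are hypothetical.

```python
import math

# Hypothetical radio map built during the training phase: each entry maps
# a known (x, y) position to the RSSI values observed from nearby WiFi
# access points at that position. Values are illustrative only.
RADIO_MAP = [
    ((0.0, 0.0), {"ap1": -40, "ap2": -70, "ap3": -60}),
    ((5.0, 0.0), {"ap1": -65, "ap2": -45, "ap3": -70}),
    ((0.0, 5.0), {"ap1": -60, "ap2": -75, "ap3": -42}),
    ((5.0, 5.0), {"ap1": -70, "ap2": -55, "ap3": -50}),
]

def rssi_distance(obs, ref):
    """Euclidean distance in signal space over the APs both readings share."""
    common = obs.keys() & ref.keys()
    return math.sqrt(sum((obs[ap] - ref[ap]) ** 2 for ap in common))

def coarse_position(observation, k=3):
    """Weighted k-nearest-neighbour position estimate from one RSSI reading."""
    ranked = sorted(RADIO_MAP, key=lambda e: rssi_distance(observation, e[1]))[:k]
    # Weight each neighbour by the inverse of its signal-space distance,
    # so fingerprints that look more like the observation count for more.
    weights = [1.0 / (rssi_distance(observation, fp) + 1e-9) for _, fp in ranked]
    total = sum(weights)
    x = sum(w * pos[0] for w, (pos, _) in zip(weights, ranked)) / total
    y = sum(w * pos[1] for w, (pos, _) in zip(weights, ranked)) / total
    return x, y

# A reading resembling the fingerprint at (0, 0) yields an estimate near it.
print(coarse_position({"ap1": -42, "ap2": -68, "ap3": -61}))
```

In the paper's pipeline, an estimate like this would only delimit the region of the 3D model searched during 2D-to-3D matching; the fine-grained pose comes from the subsequent camera resection step.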
