Abstract

Tracking the 6DoF pose of arbitrary 3D objects is a fundamental topic in Augmented Reality (AR) research and has received considerable interest over the last decades. The need for accurate and computationally efficient object tracking is evident for a broad range of today’s AR applications. In this work we present a fully comprehensive pipeline for 6DoF object tracking based on 3D scans of objects, covering object registration, initialization and frame-to-frame tracking, designed to optimize the user experience and to perform well under typical challenging conditions such as fast motion, occlusions and illumination changes. Furthermore, we present the deployment of our tracking system in a Remote Live Support AR application with 3D object-aware registration of annotations and remote execution for delay and performance optimization. Experimental results demonstrate the tracking quality, real-time capability and the advantages of remote execution for computationally less powerful mobile devices.
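
As a rough illustration of the pipeline structure named in the abstract (ORB-based initialization and reinitialization plus frame-to-frame tracking), the sketch below outlines the per-frame control flow. The class layout, the scan attributes and the PnP-based pose estimation are assumptions made for the example, not the authors' actual implementation.

```python
# Minimal sketch of the per-frame control flow: an ORB-based initializer
# estimates the first pose from the registered scan, frame-to-frame tracking
# refines it, and the initializer is re-run when tracking is lost.
# Attribute names (scan.descriptors, scan.points3d) and the use of PnP are
# illustrative assumptions.

import cv2
import numpy as np

class ObjectTracker6DoF:
    def __init__(self, scan, camera_matrix):
        self.scan = scan                    # textured 3D scan with precomputed features
        self.K = camera_matrix              # camera intrinsics (3x3)
        self.pose = None                    # (rvec, tvec) of the object, None if lost
        self.orb = cv2.ORB_create(nfeatures=1000)
        self.matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def initialize(self, frame):
        """ORB (re)initializer: match the frame against the scan and solve for pose."""
        kp, desc = self.orb.detectAndCompute(frame, None)
        if desc is None:
            return None
        matches = self.matcher.match(desc, self.scan.descriptors)
        if len(matches) < 20:               # arbitrary threshold for the sketch
            return None
        img_pts = np.float32([kp[m.queryIdx].pt for m in matches])
        obj_pts = np.float32([self.scan.points3d[m.trainIdx] for m in matches])
        ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, self.K, None)
        return (rvec, tvec) if ok else None

    def track(self, frame, prev_pose):
        """Frame-to-frame tracking: refine the previous pose on the new frame.
        (Placeholder: the actual refinement step is omitted in this sketch.)"""
        return prev_pose

    def process_frame(self, frame):
        if self.pose is None:
            self.pose = self.initialize(frame)        # initialization / reinitialization
        else:
            self.pose = self.track(frame, self.pose)  # frame-to-frame tracking
        return self.pose
```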

Highlights

  • As Augmented Reality (AR) reaches its technological maturity, its potential is unveiled by applications in a variety of fields such as industrial construction and maintenance, education, entertainment and medicine [1,2,3,4,5]

  • Our proposed AR Edge Computing architecture consists of two main actors: the client, who wants augmented reality information superimposed on their perception of the physical world, and the edge server, whose task is to deliver the AR experience to the user and to run the computationally challenging AR workload (a minimal client/server sketch follows this list)

  • We presented a novel 3D object tracking approach based on textured scans
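
The client/edge-server split can be pictured as the mobile device streaming camera frames to the edge server, which runs the heavy tracking and returns the object pose used to register the annotations. The sketch below illustrates this idea only; the HTTP endpoint, the JPEG transport and the JSON pose format are assumptions, since the system's actual wire protocol is not specified here.

```python
# Hedged sketch of the client/edge-server split: the mobile client offloads each
# camera frame to the edge server, which runs the tracker and returns the pose.
# The /track endpoint, JPEG encoding and JSON pose format are assumptions.

from flask import Flask, request, jsonify
import numpy as np
import cv2
import requests

# --- edge server side: runs the computationally heavy 6DoF tracking ---
app = Flask(__name__)

@app.route("/track", methods=["POST"])
def track_endpoint():
    frame = cv2.imdecode(np.frombuffer(request.data, np.uint8), cv2.IMREAD_COLOR)
    pose = np.eye(4)          # placeholder: the real tracker would estimate this
    return jsonify({"pose": pose.ravel().tolist()})

# --- client side (mobile device): capture, offload, render with returned pose ---
def offload_frame(frame, server_url="http://edge-server:5000/track"):
    ok, jpg = cv2.imencode(".jpg", frame)     # compress to keep uplink delay low
    resp = requests.post(server_url, data=jpg.tobytes(),
                         headers={"Content-Type": "application/octet-stream"})
    pose = np.array(resp.json()["pose"]).reshape(4, 4)
    return pose                               # used to anchor 3D-registered annotations
```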


Summary

Introduction

As Augmented Reality (AR) reaches its technological maturity, its potential is unveiled by applications in a variety of fields such as industrial construction and maintenance, education, entertainment and medicine [1,2,3,4,5]. Mobile devices often lack the computational power to run demanding AR algorithms, and the process of designing, developing, deploying and maintaining AR applications locally on different mobile devices is very expensive. To combat these two issues, we propose a solution based on real-time offloading of AR computations to a high-performance server using Edge Computing. A previous Live Support system was based on a client-server architecture using the Vuforia tracking framework, but it requires user collaboration in setting up the tracker by taking pictures from different views around the object; its expert annotations appear to be registered in 2D rather than 3D, since the geometry of the objects is not fully known. We advocate the use of a pencil filter to increase the resilience of the tracking to illumination changes.
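
The pencil filter acts as a pre-processing step that emphasizes image structure while suppressing smooth illumination gradients. The paper's exact filter is not reproduced here; the snippet below shows a common pencil-sketch-style filter (inverted Gaussian blur followed by division) as an illustrative stand-in, with arbitrary parameter choices.

```python
# Illustrative pencil-style filter for illumination robustness. The paper's
# exact pencil filter is not reproduced here; this "inverted Gaussian blur +
# division" variant is a common stand-in, with arbitrary parameters.

import cv2

def pencil_filter(bgr_frame, blur_ksize=21):
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    inverted = 255 - gray
    blurred = cv2.GaussianBlur(inverted, (blur_ksize, blur_ksize), 0)
    # Division (color dodge) pushes smoothly lit regions towards white while
    # keeping edges dark, which suppresses global illumination changes.
    sketch = cv2.divide(gray, 255 - blurred, scale=256)
    return sketch
```

In a pipeline like the one described here, such a filter would typically be applied before feature extraction so that matching depends less on the current lighting.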

Problem Formulation
Object Registration Procedure
Learning Features for Tracking
Object Tracking Algorithm
Algorithm Outline
Frame to Frame Tracking
ORB Initializer and Reinitializer
Pencil Filter
Remote Live Support Realized with Mobile Edge Computing
Mobile Edge Computing
Relevance of MEC for AR
Overview
User Side—Mobile Device
Server Side—Edge Cloud
Remote Expert
Evaluation
Tracking Quality
Runtime Measurements
Offloading Delay
Discussion
