Abstract

Accurately estimating the six degree of freedom (6-DoF) pose of objects in images is essential for a variety of applications such as robotics, autonomous driving, and AI- and vision-based autonomous navigation for unmanned aircraft systems (UAS). Developing such algorithms requires large datasets; however, generating them is tedious, as it requires annotating the 6-DoF pose of each object of interest present in the image relative to the camera. Therefore, this work presents a novel approach that automates the data acquisition and annotation process and thus reduces the annotation effort to the duration of the recording. To maximize the quality of the resulting annotations, we employ an optimization-based approach for determining the extrinsic calibration parameters of the camera. Our approach can handle multiple objects in the scene, automatically providing ground-truth labels for each object and accounting for occlusion effects between objects. Moreover, our approach can not only generate data for 6-DoF pose estimation and corresponding 3D models, but can also be extended to automatic dataset generation for object detection, instance segmentation, or volume estimation for any kind of object.

Highlights

  • We begin the evaluation of the pose annotation by focusing on the bounding box annotation quality

  • The quality is measured in terms of the intersection over union (IoU) score, which indicates how well two bounding boxes overlap [44]

  • These approaches either base their pose estimation algorithms on bounding boxes or regions of interest predicted in a first step [45,46] or train their net to directly predict the 2D bounding box [29]
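The IoU score used in the highlights above has a standard closed form for axis-aligned 2D boxes: the area of the boxes' intersection divided by the area of their union. A minimal sketch is given below; the corner-coordinate box format `(x_min, y_min, x_max, y_max)` is an assumption for illustration, not the paper's own data format.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes.

    Boxes are given as (x_min, y_min, x_max, y_max); the corner
    format is an illustrative assumption, not the paper's format.
    """
    # Corners of the intersection rectangle.
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])

    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For example, two unit-overlap boxes `(0, 0, 2, 2)` and `(1, 1, 3, 3)` share 1 unit of area out of 7 in their union, giving an IoU of 1/7; identical boxes score 1.0 and disjoint boxes score 0.0.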


Introduction

Possible applications include real-time monitoring [2], search-and-rescue operations [3], delivery of goods [4], precision agriculture [5], and infrastructure monitoring (power grids, motorways, rail infrastructure, etc.) [6]. For the latter, relying on UAS leads to more efficient maintenance processes. To achieve these efficiency improvements, current developments aim to let the UAS perform a completely autonomous flight and recording process, which allows the inspection of infrastructure beyond the pilot's line of sight. Collecting high-quality data is of utmost importance for the training and validation of such algorithms.

