Abstract

Pose estimation and environmental perception are fundamental capabilities of autonomous robots. In this paper, a novel real-time pose estimation and object detection (RPEOD) strategy for aerial robot target tracking is presented. The aerial robot is equipped with a binocular fisheye camera for pose estimation and a depth camera to capture the spatial position of the tracked target. The RPEOD system uses a sparse optical flow algorithm to track image corner features, and the local bundle adjustment is restricted to a sliding window. Furthermore, we propose YZNet, a lightweight neural inference structure, and use it as the backbone in YOLOv5 (a state-of-the-art real-time object detector). The RPEOD system dramatically reduces the computational complexity of reprojection error minimization and neural network inference; thus, it can run in real time on the onboard computer carried by the aerial robot. The RPEOD system is evaluated in both simulated and real-world experiments, demonstrating clear advantages over state-of-the-art approaches while being significantly faster.
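
The sliding-window bundle adjustment mentioned above can be summarized by a single objective. The formulation below is a hedged sketch using our own shorthand (window $\mathcal{W}$ of recent keyframes, camera pose $T_k$, landmark $p_j$, projection function $\pi$, observation $z_{k,j}$, and robust loss $\rho$), not the paper's exact notation:

    % Illustrative sliding-window reprojection-error objective:
    % only poses and landmarks observed inside the window W are optimized.
    \min_{\{T_k\},\,\{p_j\}} \sum_{k \in \mathcal{W}} \sum_{j \in \mathcal{O}(k)}
        \rho\big( \| z_{k,j} - \pi(T_k\, p_j) \|^{2}_{\Sigma_{k,j}} \big)

Restricting the optimization to the window $\mathcal{W}$ rather than the whole trajectory is what keeps reprojection-error minimization cheap enough for onboard, real-time use.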

Highlights

  • Micro aerial vehicles (MAVs) will soon play a significant role in industrial inspection, accident warning, and national defense [1–3]

  • In the world coordinate system (w), the quadrotor MAV's orientation, position, and speed can be obtained from inertial measurement unit (IMU) measurements, which are expressed in the quadrotor robot body frame and are disturbed by stochastic noise n, an acceleration bias ba, and a gyroscope bias bω, the biases being approximated as Brownian motion (see the measurement model sketched after this list)

  • We propose real-time pose estimation and object detection (RPEOD), a simultaneous pose estimation and object detection system for aerial robot target tracking that combines inertial measurements with a sparse optical flow tracking algorithm to estimate aerial robot motion between consecutive video frames
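
A common form of the IMU measurement model referenced in the second highlight is sketched below. The notation follows standard visual-inertial estimation conventions (raw body-frame measurements $\hat{a}_t$, $\hat{\omega}_t$, gravity $g^w$ in the world frame, rotation $R^{b_t}_{w}$ from world to body, and white-noise terms driving the biases) and is an assumption rather than a reproduction of the paper's exact equations:

    % Accelerometer and gyroscope measurements in the body frame (b),
    % corrupted by additive white noise and slowly varying biases:
    \hat{a}_t      = a_t + R^{b_t}_{w}\, g^{w} + b_{a_t} + n_{a}
    \hat{\omega}_t = \omega_t + b_{\omega_t} + n_{\omega}
    % The biases themselves are modeled as Brownian motion, i.e. driven by white noise:
    \dot{b}_{a_t} = n_{b_a}, \qquad \dot{b}_{\omega_t} = n_{b_\omega}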

Summary

Introduction

Micro aerial vehicles (MAVs) will soon play a significant role in industrial inspection, accident warning, and national defense [1–3]. In the aerial robot target-tracking task, the state estimator and object detector must run in real time on airborne hardware with limited computing and storage resources. We present RPEOD, a simultaneous real-time pose estimation and object detection system for aerial robot target tracking, which combines inertial measurements and a sparse optical flow tracking algorithm [9] to estimate MAV motion between consecutive video frames. Feature-based methods typically proceed in four steps: first, extracting a sparse set of image corner points (e.g., SIFT, ORB, Shi-Tomasi) in consecutive video frames; second, matching these corner points across images using invariant feature descriptors; third, recovering the robot pose from the spatial epipolar geometry constraint; and finally, refining the position and orientation through reprojection error minimization. Researchers have long used feature-based methods of this kind to incrementally estimate robot pose.
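
As an illustration of the optical-flow front end described above, the following minimal sketch (not the authors' implementation) uses OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow to track features between two frames, then recovers the relative pose from the epipolar constraint with RANSAC. The intrinsics matrix K and the frame inputs are placeholders:

    # Minimal sketch of a sparse-optical-flow visual odometry front end (illustrative only).
    import cv2
    import numpy as np

    K = np.array([[460.0,   0.0, 320.0],   # placeholder camera intrinsics
                  [  0.0, 460.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    def relative_pose(prev_gray, curr_gray):
        # 1) Extract a sparse set of Shi-Tomasi corners in the previous frame.
        prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                           qualityLevel=0.01, minDistance=10)
        # 2) Track the corners into the current frame with pyramidal Lucas-Kanade
        #    optical flow (replacing descriptor matching in the classical pipeline).
        curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
        good_prev = prev_pts[status.ravel() == 1]
        good_curr = curr_pts[status.ravel() == 1]
        # 3) Recover relative rotation and up-to-scale translation from the
        #    epipolar geometry constraint, rejecting outliers with RANSAC.
        E, mask = cv2.findEssentialMat(good_prev, good_curr, K,
                                       method=cv2.RANSAC, prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, good_prev, good_curr, K, mask=mask)
        return R, t  # step 4, reprojection-error refinement, runs in the back end

The fourth step, refining poses and landmarks by minimizing reprojection error, corresponds to the sliding-window bundle adjustment sketched after the abstract.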

Neural Inference Acceleration Technology
Inertial Measurements Preintegration
Tightly Coupled Aerial Robot Pose Estimation
Designing Efficient CNNs for Real-Time Object Detection
Channel Attention Module
Nonlinear Activation Function
Normalization of the Network Activation
Efficient Blocks for Inference
Building Efficient CNNs
Implementation for Target-Tracking System
Object Detector
Real-World Target Tracking
Findings
Conclusions and Future Work