Abstract

In this paper, a multiple object detection, recognition, and tracking system for unmanned aerial vehicles (UAVs) is studied. The system can be implemented on any UAV platform, the main requirement being that the UAV carries a suitable onboard computational unit and a camera. It is intended for use in a maritime object tracking framework for UAVs, which enables a UAV to perform multiobject tracking and situational awareness of the sea surface in real time during an operation. By combining machine vision, which automatically detects objects in the camera's image stream, with the UAV's navigation data, the onboard computer georeferences each detection to measure the location of the detected objects in a local North‐East (NE) coordinate frame. A tracking algorithm based on a Kalman filter with a constant‐velocity motion model uses these automatically obtained position measurements to track and estimate each object's position and velocity. Furthermore, a global‐nearest‐neighbor algorithm is applied for data association, using a distance measure based not only on the physical distance between an object's estimated position and the measured position, but also on how similar the objects appear in the camera image. Four field tests were conducted at sea to verify the object detection and tracking system. One of the flight tests was a two‐object tracking scenario, which is also used in three scenarios with two additional simulated objects. The tracking results demonstrate the effectiveness of using visual recognition for data association to avoid interchanging the two estimated object trajectories. Furthermore, real‐time computations performed on the gathered data show that the system is able to automatically detect and track the position and velocity of a boat.
Given at least 100 georeferenced measurements of the boat's position, the system estimated and tracked the position with an accuracy of 5–15 m from a 400 m altitude while the boat was in the camera's field of view (FOV). The estimated speed and course also converged to the object's true trajectory (measured by the Global Positioning System, GPS) in the tested scenarios. This enables the system to track boats while they are outside the camera's FOV for extended periods of time, with tracking results showing a drift in the boat's position estimate as low as 1–5 m/min outside of the FOV.

Highlights

  • In this paper, a multiple object detection, recognition, and tracking system for unmanned aerial vehicles (UAVs) is studied

  • The algorithm is intended to be used in an ocean surface object tracking system for UAVs, to enable UAVs to perform multiobject tracking and situational awareness in real time

  • Using onboard navigation data to get the UAV's and camera's attitude and altitude, the onboard computer is able to georeference each object detection to measure the location of detected objects in a local NED coordinate frame


Summary

Related work

The recent increase in the commercial availability of small unmanned aerial vehicles (UAVs) has led to their use in many different applications, such as inspection of structures, surveillance, and search and rescue. Although parts of the object detection and tracking system are similar to already published methods, the combination of algorithms and UAV payload presented in this paper constitutes a state‐of‐the‐art system in the robustness with which the position, speed, and course of the objects are estimated based on thermal images from a fixed‐wing UAV. The UAV object detection, tracking, and recognition module is a machine vision algorithm running onboard the UAV, supplying the path planner with estimates of each object's position and velocity. These estimates can be found using different methods. A more detailed view of the performance of the object detection and tracking module for each flight is also given.
